WraLU: Fast and Precise ReLU Hull Approximation
10×–10⁶× faster verification with up to 50% fewer constraints
What is WraLU?
Neural network verification requires computing tight bounds on neuron activations to certify robustness properties. For ReLU networks, multi-neuron constraints are essential for precision, but exact convex hull computation is prohibitively expensive—scaling exponentially with the number of neurons.
WraLU solves this challenge by computing fast and precise over-approximations of ReLU hulls. Instead of computing the exact convex hull, WraLU wraps the hull with a carefully constructed polytope that:
- Reuses the linear pieces of the ReLU function as its lower faces
- Constructs upper faces that are adjacent to these lower faces
- Minimizes the number of constraints while maintaining tight bounds
This geometric approach achieves a 10×–10⁶× speedup over exact methods while producing up to 50% fewer constraints than state-of-the-art relaxations.
Key Insight
By exploiting the piecewise-linear structure of ReLU functions, WraLU formulates a convex polytope that “wraps” the ReLU hull: the lower faces come directly from the ReLU's linear pieces, and the upper faces are constructed to be adjacent to them, yielding a tight approximation with minimal overhead.
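To make the geometry concrete, consider the classical single-neuron case, where the wrapping is exact: for y = ReLU(x) with input bounds l < 0 < u, the two lower faces are the ReLU's own linear pieces (y ≥ 0 and y ≥ x), and a single upper face through the points (l, 0) and (u, u) closes the hull. The sketch below is illustrative only and is not WraLU's multi-neuron algorithm:

```python
import numpy as np

def relu_triangle_hull(l: float, u: float):
    """H-representation A @ [x, y] <= b of the exact hull of
    y = max(x, 0) over l <= x <= u (the classical triangle).

    The two lower faces are the ReLU's own linear pieces; one
    upper face through (l, 0) and (u, u) closes the polytope.
    """
    assert l < 0 < u, "the hull is only non-trivial when x spans 0"
    A = np.array([
        [0.0, -1.0],          # -y    <= 0  (lower face: y >= 0)
        [1.0, -1.0],          # x - y <= 0  (lower face: y >= x)
        [-u / (u - l), 1.0],  # upper face through (l, 0) and (u, u)
    ])
    b = np.array([0.0, 0.0, -u * l / (u - l)])
    return A, b

# The upper face is tight at both hull vertices it was built from:
A, b = relu_triangle_hull(-1.0, 2.0)
for v in [(-1.0, 0.0), (2.0, 2.0), (0.0, 0.0)]:
    assert np.all(A @ np.array(v) <= b + 1e-9)
```

In the multi-neuron setting the exact hull can have exponentially many faces; WraLU keeps the same reuse-and-wrap construction, which is what keeps the constraint count small.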
Key Features & Benefits
10×–10⁶× faster than exact convex hull methods, scaling to 8-dimensional inputs in about 10 seconds and enabling practical verification of modern networks.
Achieves 0.4×–0.7× precision relative to the state-of-the-art SBLM+PDDM method, with tighter bounds than the triangle relaxation.
Reduces constraints by up to 99% vs exact methods and up to 30% vs SBLM+PDDM, speeding up verification workflows.
Successfully verifies modern ResNets and CNNs with tens of thousands of neurons that other methods cannot handle.
How It Works
WraLU approximates the convex hull of ReLU activations through a two-phase geometric construction:
1. Lower faces: directly reuse the linear pieces of the ReLU function as the lower faces of the wrapping polytope. This keeps the approximation as tight as possible to the actual hull.
2. Upper faces: construct upper faces adjacent to the lower faces by identifying critical edge-vertex pairs. These faces form supporting hyperplanes that complete the polytope wrapping.
The result is a convex polytope represented in H-representation (half-space constraints) that can be directly integrated into verification frameworks like PRIMA.
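As an illustration of how such an H-representation feeds into a verifier, the sketch below bounds an output neuron with a linear program over the triangle-hull constraints from the earlier example (scipy is an assumption here; PRIMA's actual interface differs):

```python
import numpy as np
from scipy.optimize import linprog

# Constraints A @ [x, y] <= b for the triangle hull with l = -1, u = 2
# (the same polytope as in the earlier sketch).
A = np.array([[0.0, -1.0],
              [1.0, -1.0],
              [-2.0 / 3.0, 1.0]])
b = np.array([0.0, 0.0, 2.0 / 3.0])

# A verifier bounds an output neuron by optimizing over the polytope.
# linprog minimizes, so negate the objective to maximize y = z[1].
res = linprog(c=[0.0, -1.0], A_ub=A, b_ub=b,
              bounds=[(None, None), (None, None)])
print("upper bound on y:", -res.fun)  # 2.0, attained at vertex (u, u)
```

Fewer rows in A mean smaller LPs, which is why reducing the constraint count speeds up verification directly.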
Performance Highlights
WraLU demonstrates significant improvements over existing methods across multiple benchmarks:
| Method | Speed (relative) | Precision (relative) | Constraint Count |
|---|---|---|---|
| WraLU (ours) | 10×–10⁶× faster | 0.4×–0.7× | ↓ 50% |
| Exact Hull | 1× (baseline) | 1.0× (exact) | High |
| Triangle Relaxation | Fast | ~1.0× | Moderate |
| SBLM+PDDM | Slow | 1.0× (exact) | High |
Real-World Impact
WraLU has been successfully integrated into the PRIMA verifier, enabling verification of production-scale neural networks including MNIST and CIFAR-10 models with thousands of neurons. Verification tasks that previously took hours or were infeasible now complete in seconds to minutes.
Getting Started
To use WraLU in your verification workflow:
1. Visit the GitHub repository: Trusted-System-Lab/WraLU
2. Follow the installation instructions in the README
3. Integrate with PRIMA or your preferred verification framework
4. Configure neuron grouping for your network architecture (a sketch follows below)
For a unified Python implementation supporting multiple activation functions, see the wraact library.
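For step 4, neuron grouping determines the unit at which multi-neuron hulls are approximated. The helper below is a hypothetical sketch of one plausible sliding-window scheme; group_neurons is not part of the WraLU API, so consult the README for the actual configuration:

```python
def group_neurons(layer_size: int, k: int = 3):
    """Hypothetical helper (not WraLU's API): split a layer's neuron
    indices into overlapping k-neuron groups, each of which gets its
    own multi-neuron ReLU hull approximation."""
    idx = list(range(layer_size))
    # Sliding windows of size k with stride 1, so neighboring groups
    # share neurons; larger k is tighter but costs more per group.
    return [tuple(idx[i:i + k]) for i in range(layer_size - k + 1)]

print(group_neurons(6, k=3))
# [(0, 1, 2), (1, 2, 3), (2, 3, 4), (3, 4, 5)]
```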
Publications & Resources
- Conference Paper
Zhongkui Ma, Jiaying Li, Guangdong Bai. “ReLU Hull Approximation”. The 51st ACM SIGPLAN Symposium on Principles of Programming Languages (POPL’24), London, UK, January 2024.
Venue: POPL’24 (Top-tier PL conference, CORE A*)
- Presentations
POPL’24 Live Presentation: Watch on YouTube (January 2024, London, UK)
ADC’24 Distinguished Tutorial: “Robustness Verification of Neural Networks using WraLU” (December 2024, Gold Coast, Australia)
FPBench Community Meeting: May 2024
Formal Methods Workshop: University of Queensland, May 2024
- Code & Artifacts
GitHub Repository: Trusted-System-Lab/WraLU
Unified Implementation: wraact library (regularly maintained)
BibTeX Citation

```bibtex
@inproceedings{ma2024relu,
  author    = {Ma, Zhongkui and Li, Jiaying and Bai, Guangdong},
  title     = {ReLU Hull Approximation},
  booktitle = {Proceedings of the 51st ACM SIGPLAN Symposium on Principles of Programming Languages},
  series    = {POPL '24},
  year      = {2024},
  doi       = {10.1145/3632917},
  publisher = {ACM},
  address   = {London, United Kingdom}
}
```