WraAct: Convex Hull Approximation for General Activation Functions¶
400X faster with 150X better precision and 50% fewer constraints
What is WraAct?¶
Modern neural networks use diverse activation functions beyond ReLU: Sigmoid, Tanh, MaxPool, ELU, and others. Verifying networks with these activations is challenging because existing methods typically:
- only handle ReLU efficiently,
- use generic approximations that are too loose, or
- scale poorly to multiple neurons.
WraAct provides efficient tight over-approximations for general activation functions by introducing a novel geometric approach based on double-linear-piece (DLP) functions.
The core idea: Use linear constraints to smooth out fluctuations in the target function, simplifying the local geometry. This enables fast, precise convex hull approximation across diverse activation types.
On average, WraAct is 400X faster than state-of-the-art methods while improving precision by 150X and using 50% fewer constraints. It can verify ResNets with 22,000 neurons in just 1 minute.
Key Innovation
WraAct leverages double-linear-piece (DLP) functions to divide the input domain into regions where the target activation function exhibits simplified geometry. By constructing convex hulls locally and combining them, WraAct achieves both efficiency and precision for complex activation functions that previous methods struggle with.
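For intuition, a double-linear-piece function is simply two affine pieces joined at a breakpoint; ReLU and Leaky ReLU are the most familiar special cases. The sketch below is only an illustration of that shape (the helper name and parameters are ours, not part of WraAct's API):

```python
import numpy as np

def dlp(x, slope_left, slope_right, x0=0.0, y0=0.0):
    """Illustrative double-linear-piece (DLP) function: two affine pieces
    joined at the breakpoint (x0, y0)."""
    x = np.asarray(x, dtype=float)
    left = slope_left * (x - x0) + y0
    right = slope_right * (x - x0) + y0
    return np.where(x < x0, left, right)

x = np.linspace(-2.0, 2.0, 5)      # [-2, -1, 0, 1, 2]
print(dlp(x, 0.0, 1.0))            # ReLU:       [0, 0, 0, 1, 2]
print(dlp(x, 0.01, 1.0))           # Leaky ReLU: [-0.02, -0.01, 0, 1, 2]
```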
Supported Activation Functions¶
WraAct provides native support for a wide range of activation functions commonly used in modern neural networks:
| Activation | Description |
|---|---|
| ReLU | Specialized handling inherited from WraLU |
| Sigmoid | S-shaped smooth activation |
| Tanh | Hyperbolic tangent activation |
| MaxPool | Piece-wise linear pooling |
| ELU | Exponential Linear Unit |
| Leaky ReLU | ReLU with negative slope |
Extensibility
The DLP-based framework is generalizable to other activation functions. If your activation function has piece-wise behavior or can be approximated locally, WraAct’s methodology can be adapted to support it.
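As one concrete example of what "approximated locally" can mean, consider a generic interval bound (illustrative only, not WraAct's DLP construction): ELU is convex, so on any interval [l, u] the chord through its endpoints is a valid linear upper bound and any tangent line is a valid lower bound. A minimal NumPy sketch:

```python
import numpy as np

def elu(x, alpha=1.0):
    return np.where(x >= 0, x, alpha * (np.exp(x) - 1.0))

def elu_grad(x, alpha=1.0):
    return np.where(x >= 0, 1.0, alpha * np.exp(x))

def linear_bounds_elu(l, u, alpha=1.0):
    """Linear bounds k*x + b for ELU on [l, u]. ELU is convex, so the chord
    through (l, f(l)) and (u, f(u)) lies above it and any tangent lies below."""
    fl, fu = elu(l, alpha), elu(u, alpha)
    k_up = (fu - fl) / (u - l)          # upper bound: the chord
    b_up = fl - k_up * l
    m = 0.5 * (l + u)                   # lower bound: tangent at the midpoint
    k_lo = elu_grad(m, alpha)
    b_lo = elu(m, alpha) - k_lo * m
    return (k_lo, b_lo), (k_up, b_up)

(k_lo, b_lo), (k_up, b_up) = linear_bounds_elu(-2.0, 1.0)
xs = np.linspace(-2.0, 1.0, 101)
assert np.all(k_lo * xs + b_lo <= elu(xs) + 1e-9)   # sound lower bound
assert np.all(elu(xs) <= k_up * xs + b_up + 1e-9)   # sound upper bound
```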
Key Features & Benefits¶
- First unified approach for diverse activation functions. No need for separate tools or methods for each activation type.
- 400X faster on average than SBLM+PDDM. Scales to 8 dimensions in 10 seconds, enabling practical verification workflows.
- 150X better precision on average. Tighter bounds lead to more successful verification and fewer false negatives.
- 50% fewer constraints on average. Reduces memory usage and speeds up downstream LP/MILP solving.
How It Works¶
WraAct uses a divide-and-conquer strategy based on double-linear-piece (DLP) functions:
1. Identify DLP functions that partition the input domain into regions. In each region, the target activation function exhibits simplified local geometry: either S-shaped or ReLU-like behavior.
2. Within each region, apply specialized approximation strategies:
   - S-shaped regions: smooth, predictable curvature enables tight linear over-approximation.
   - ReLU-like regions: leverage WraLU's wrapping approach for piece-wise linear functions.
3. Combine local approximations into a unified global polytope. The result is an H-representation (half-space constraints) ready for integration into verification frameworks.
This approach achieves both geometric precision (tight bounds) and computational efficiency (fast execution).
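The sketch below walks through this divide-and-combine idea for a single Sigmoid neuron in 2-D, using SciPy for the hull step. The local approximations (one endpoint-and-tangent triangle per region) are our own simplified stand-in rather than WraAct's multi-neuron DLP construction, but the final object is the same kind of H-representation described above:

```python
import numpy as np
from scipy.spatial import ConvexHull

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def dsigmoid(x):
    s = sigmoid(x)
    return s * (1.0 - s)

def tangent_intersection(x0, x1, f, df):
    """Intersection point of the tangents to f at x0 and x1 (assumed non-parallel)."""
    k0, b0 = df(x0), f(x0) - df(x0) * x0
    k1, b1 = df(x1), f(x1) - df(x1) * x1
    xi = (b1 - b0) / (k0 - k1)
    return np.array([xi, k0 * xi + b0])

def curve_triangle(x0, x1, f, df):
    """Triangle enclosing the graph of f on [x0, x1] when f is convex or concave
    there: the two endpoints plus the intersection of the end tangents."""
    return np.array([[x0, f(x0)], [x1, f(x1)], tangent_intersection(x0, x1, f, df)])

l, u = -3.0, 2.0  # input range crossing Sigmoid's inflection point at x = 0

# 1. Divide: Sigmoid is convex on [l, 0] and concave on [0, u].
# 2. Approximate locally: one triangle per region encloses the local graph.
region1 = curve_triangle(l, 0.0, sigmoid, dsigmoid)
region2 = curve_triangle(0.0, u, sigmoid, dsigmoid)

# 3. Combine: the convex hull of both local polytopes contains the whole
#    graph {(x, sigmoid(x)) : l <= x <= u}.
hull = ConvexHull(np.vstack([region1, region2]))

# H-representation: each row [a1, a2, c] encodes a1*x + a2*y + c <= 0.
print(hull.equations)

# Sanity check: every sampled point of the graph satisfies all half-spaces.
xs = np.linspace(l, u, 201)
pts = np.column_stack([xs, sigmoid(xs)])
assert np.all(pts @ hull.equations[:, :2].T + hull.equations[:, 2] <= 1e-9)
```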
Performance Highlights¶
WraAct demonstrates dramatic improvements over state-of-the-art methods across 100 benchmark samples:
| Method | Speed (relative) | Precision (relative) | Constraint Count |
|---|---|---|---|
| WraAct (ours) | 400X faster | 150X better | ↓ 50% |
| SBLM+PDDM | 1X (baseline) | 1X (baseline) | High |
| Generic Relaxation | Fast | Poor (~100X worse) | Low |
Scalability to Large Networks
WraAct successfully verifies ResNet architectures with 22,000 neurons in just 1 minute. This represents a significant leap in scalability, making formal verification practical for production-scale deep networks that were previously out of reach.
Comparison with WraLU¶
While WraLU specializes in ReLU activation functions, WraAct generalizes the wrapping approach to diverse activation types:
| Aspect | WraLU | WraAct |
|---|---|---|
| Activation Functions | ReLU only | ReLU, Sigmoid, Tanh, MaxPool, ELU, Leaky ReLU, and more |
| Core Technique | Lower/upper face wrapping | DLP-based domain division + local wrapping |
| Performance | 10X-10⁶X faster than exact | 400X faster, 150X better precision (avg) |
| Use Case | ReLU-only networks | Modern networks with diverse activations |
Which Tool Should I Use?
- For ReLU-only networks: Either tool works, but WraLU is optimized specifically for this case
- For networks with Sigmoid, Tanh, MaxPool, or other activations: Use WraAct
- For mixed architectures: WraAct handles all activation types in a unified framework
Getting Started¶
To use WraAct in your verification workflow:
1. Visit the GitHub repository: Trusted-System-Lab/WraAct
2. Follow installation instructions in the README
3. Integrate with your verification framework (supports standard interfaces)
4. Configure activation-specific settings for optimal performance
For a unified Python implementation with easy integration into PyTorch workflows, see the wraact library.
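Once constraints are available, they are consumed like any other half-space system A z <= b, for example by adding them to an LP and optimizing an output variable to obtain tight bounds. The snippet below illustrates that step with SciPy; the constraints used are the well-known triangle relaxation of ReLU on [-1, 1], standing in for whatever constraints your chosen tool produces:

```python
import numpy as np
from scipy.optimize import linprog

# Half-space system A @ z <= b over z = (x, y): the triangle relaxation of
# y = ReLU(x) on x in [-1, 1], used here as a stand-in for tool output.
A = np.array([
    [-1.0,  0.0],   # -x        <= 1    (x >= -1)
    [ 1.0,  0.0],   #  x        <= 1
    [ 0.0, -1.0],   #      -y   <= 0    (y >= 0)
    [ 1.0, -1.0],   #  x  - y   <= 0    (y >= x)
    [-0.5,  1.0],   # -x/2 + y  <= 1/2  (y <= (x + 1) / 2)
])
b = np.array([1.0, 1.0, 0.0, 0.0, 0.5])

free = [(None, None), (None, None)]  # both variables may take negative values

# Bound the output y over the polytope by minimizing and maximizing it.
lo = linprog(c=[0.0,  1.0], A_ub=A, b_ub=b, bounds=free, method="highs")
hi = linprog(c=[0.0, -1.0], A_ub=A, b_ub=b, bounds=free, method="highs")
print("output bounds:", lo.fun, -hi.fun)   # y lies in [0, 1] for this polytope
```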
Publications & Resources¶
- Conference Paper
  - Zhongkui Ma, Zihan Wang, Guangdong Bai. “Convex Hull Approximation for Activation Functions”. ACM SIGPLAN Conference on Object-Oriented Programming, Systems, Languages, and Applications (OOPSLA’25), Singapore, October 2025.
  - Venue: OOPSLA’25 within SPLASH’25 (Top-tier PL conference, CORE A*)
  - Artifact: https://doi.org/10.5281/zenodo.17007119
- Presentations
  - OOPSLA’25 Presentation: Thursday, October 16, 2025, 16:00-16:15 at Orchid West, Marina Bay Sands Convention Centre, Singapore
  - SPLASH’25 Conference: October 2025
- Code & Artifacts
  - GitHub Repository: Trusted-System-Lab/WraAct
  - Unified Implementation: wraact library (regularly maintained)
  - Zenodo Artifact: https://doi.org/10.5281/zenodo.17007119
BibTeX Citation
```bibtex
@inproceedings{ma2025convex,
  author    = {Ma, Zhongkui and Wang, Zihan and Bai, Guangdong},
  title     = {Convex Hull Approximation for Activation Functions},
  booktitle = {Proceedings of the ACM SIGPLAN Conference on Object-Oriented Programming, Systems, Languages, and Applications},
  series    = {OOPSLA '25},
  year      = {2025},
  doi       = {10.1145/3763086},
  publisher = {ACM},
  address   = {Singapore}
}
```