WraAct: Convex Hull Approximation for General Activation Functions

400X faster with 150X better precision and 50% fewer constraints

What is WraAct?

Modern neural networks use diverse activation functions beyond ReLU, such as Sigmoid, Tanh, MaxPool, and ELU. Verifying networks with these activations is challenging because existing methods typically:

  • Handle only ReLU efficiently

  • Use generic approximations that are too loose

  • Scale poorly to multiple neurons

WraAct provides efficient tight over-approximations for general activation functions by introducing a novel geometric approach based on double-linear-piece (DLP) functions.

The core idea: Use linear constraints to smooth out fluctuations in the target function, simplifying the local geometry. This enables fast, precise convex hull approximation across diverse activation types.
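
For intuition, here is a tiny 1-D sketch (an illustrative example, not WraAct's exact construction): with alpha = 1, the two-piece function max(x, -1) sits below ELU on the whole real line, so a single double-linear-piece already tracks the activation's shape with only two linear pieces.

# Illustrative 1-D example of a double-linear-piece (DLP) bound; WraAct's
# actual construction operates on multi-neuron input polytopes.
import numpy as np

def elu(x, alpha=1.0):
    # ELU: x for x >= 0, alpha * (exp(x) - 1) for x < 0
    return np.where(x >= 0, x, alpha * (np.exp(x) - 1.0))

def dlp_lower(x, alpha=1.0):
    # Two linear pieces, y = x and y = -alpha; their pointwise max never exceeds ELU.
    return np.maximum(x, -alpha)

xs = np.linspace(-6.0, 6.0, 1201)
assert np.all(dlp_lower(xs) <= elu(xs) + 1e-12)
print("largest gap on [-6, 6]:", float(np.max(elu(xs) - dlp_lower(xs))))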

On average, WraAct runs 400X faster than state-of-the-art methods while improving precision by 150X and using 50% fewer constraints. It can verify ResNets with 22,000 neurons in just 1 minute.

Key Innovation

WraAct leverages double-linear-piece (DLP) functions to divide the input domain into regions where the target activation function exhibits simplified geometry. By constructing convex hulls locally and combining them, WraAct achieves both efficiency and precision for complex activation functions that previous methods struggle with.

Supported Activation Functions

WraAct provides native support for a wide range of activation functions commonly used in modern neural networks:

  • 🔲 ReLU: specialized handling inherited from WraLU

  • 📈 Sigmoid: S-shaped smooth activation

  • 📉 Tanh: hyperbolic tangent activation

  • ⬇️ MaxPool: piece-wise linear pooling

  • 🔄 ELU: Exponential Linear Unit

  • ⚡ Leaky ReLU: ReLU with negative slope

Extensibility

The DLP-based framework is generalizable to other activation functions. If your activation function has piece-wise behavior or can be approximated locally, WraAct’s methodology can be adapted to support it.
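
As a hedged example of such an adaptation (softplus is not in the list above; it is used here only to illustrate the idea): softplus(x) = ln(1 + e^x) lies strictly above both y = x and y = 0, so the ReLU-shaped DLP max(x, 0) is already a valid lower bound from which a local approximation can start.

# Sketch only: a DLP lower bound for a custom smooth activation (softplus).
# Any activation admitting such piece-wise bounds fits the same recipe.
import numpy as np

def softplus(x):
    return np.log1p(np.exp(x))

def dlp_lower(x):
    return np.maximum(x, 0.0)   # ReLU-shaped double-linear-piece bound

xs = np.linspace(-8.0, 8.0, 1601)
assert np.all(dlp_lower(xs) <= softplus(xs))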

Key Features & Benefits

🌐 Generality

First unified approach for diverse activation functions. No need for separate tools or methods for each activation type.

⚡ Speed

400X faster on average than SBLM+PDDM. Computes convex hulls for inputs of up to 8 dimensions within 10 seconds, enabling practical verification workflows.

🎯 Precision

150X better precision on average. Tighter bounds lead to more successful verification and fewer false negatives.

📉 Efficiency

50% fewer constraints on average. Reduces memory usage and speeds up downstream LP/MILP solving.

How It Works

WraAct uses a divide-and-conquer strategy based on double-linear-piece (DLP) functions:

1️⃣ Domain Division via DLP Functions

Identify DLP functions that partition the input domain into regions. In each region, the target activation function exhibits simplified local geometry—either S-shaped or ReLU-like behavior.
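
A 1-D illustration of this step (the real partition is computed over multi-neuron input polytopes, so treat this as intuition only): splitting a sigmoid neuron's input interval at the inflection point x = 0 leaves one purely convex region and one purely concave region.

# Toy 1-D version of domain division: split at the sigmoid inflection point so
# each region has a single curvature and therefore simple local geometry.
def split_sigmoid_domain(lo, hi):
    regions = []
    if lo < 0.0:
        regions.append((lo, min(hi, 0.0), "convex"))    # sigmoid is convex on (-inf, 0]
    if hi > 0.0:
        regions.append((max(lo, 0.0), hi, "concave"))   # sigmoid is concave on [0, inf)
    return regions

print(split_sigmoid_domain(-3.0, 2.0))   # [(-3.0, 0.0, 'convex'), (0.0, 2.0, 'concave')]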

2️⃣ Local Convex Hull Approximation

Within each region, apply specialized approximation strategies (a minimal sketch follows this list):

  • S-shaped regions: Smooth, predictable curvature enables tight linear over-approximation

  • ReLU-like regions: Leverage WraLU’s wrapping approach for piece-wise linear functions
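
A minimal sketch of the S-shaped case (single neuron, single region; the paper's construction handles several neurons jointly): on a region where sigmoid is convex, the chord through the endpoints is a sound upper bound and a tangent line is a sound lower bound.

# Tangent/secant bounds for sigmoid on [lo, hi] with hi <= 0, where it is convex.
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def convex_region_bounds(lo, hi):
    # Upper bound: the secant (chord) through the endpoints.
    slope_u = (sigmoid(hi) - sigmoid(lo)) / (hi - lo)
    upper = lambda x: sigmoid(lo) + slope_u * (x - lo)
    # Lower bound: the tangent at the midpoint (any tangent works for a convex function).
    m = 0.5 * (lo + hi)
    slope_l = sigmoid(m) * (1.0 - sigmoid(m))   # derivative of sigmoid at m
    lower = lambda x: sigmoid(m) + slope_l * (x - m)
    return lower, upper

lower, upper = convex_region_bounds(-4.0, 0.0)
xs = np.linspace(-4.0, 0.0, 401)
assert np.all(lower(xs) <= sigmoid(xs) + 1e-12) and np.all(sigmoid(xs) <= upper(xs) + 1e-12)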

3️⃣ Global Combination

Combine local approximations into a unified global polytope. The result is an H-representation (half-space constraints) ready for integration into verification frameworks.
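
For context, an H-representation is just a matrix-vector pair (A, b) describing the polytope {z : A z <= b} over the stacked input and output variables. The sketch below shows the data structure with made-up example coefficients, not constraints produced by WraAct.

# Schematic H-representation over z = (x, y): rows of A and entries of b are
# placeholder numbers, standing in for constraints produced by the local steps.
import numpy as np

A = np.array([
    [ 1.0,  0.0],   #  x <= 2             (input upper bound)
    [-1.0,  0.0],   # -x <= 3             (input lower bound, i.e. x >= -3)
    [-0.2,  1.0],   #  y - 0.2 x <= 0.9   (example upper face)
    [ 0.2, -1.0],   # -y + 0.2 x <= -0.1  (example lower face)
])
b = np.array([2.0, 3.0, 0.9, -0.1])

def contains(z, tol=1e-9):
    return bool(np.all(A @ z <= b + tol))

print(contains(np.array([0.0, 0.5])))   # True: the point satisfies every half-space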

This approach achieves both geometric precision (tight bounds) and computational efficiency (fast execution).

Performance Highlights

WraAct demonstrates dramatic improvements over state-of-the-art methods across 100 benchmark samples:

Method             | Speed (relative) | Precision (relative) | Constraint Count
-------------------|------------------|----------------------|------------------
WraAct (ours)      | 400X faster      | 150X better          | ↓ 50%
SBLM+PDDM          | 1X (baseline)    | 1X (baseline)        | High
Generic Relaxation | Fast             | Poor (~100X worse)   | Low

Scalability to Large Networks

WraAct successfully verifies ResNet architectures with 22,000 neurons in just 1 minute. This represents a significant leap in scalability, making formal verification practical for production-scale deep networks that were previously out of reach.

Comparison with WraLU

While WraLU specializes in ReLU activation functions, WraAct generalizes the wrapping approach to diverse activation types:

Aspect               | WraLU                      | WraAct
---------------------|----------------------------|--------------------------------------------------------
Activation Functions | ReLU only                  | ReLU, Sigmoid, Tanh, MaxPool, ELU, Leaky ReLU, and more
Core Technique       | Lower/upper face wrapping  | DLP-based domain division + local wrapping
Performance          | 10X-10⁶X faster than exact | 400X faster, 150X better precision (avg)
Use Case             | ReLU-only networks         | Modern networks with diverse activations

Which Tool Should I Use?

  • For ReLU-only networks: Either tool works, but WraLU is optimized specifically for this case

  • For networks with Sigmoid, Tanh, MaxPool, or other activations: Use WraAct

  • For mixed architectures: WraAct handles all activation types in a unified framework

Getting Started

To use WraAct in your verification workflow:

  1. Visit the GitHub repository: Trusted-System-Lab/WraAct

  2. Follow installation instructions in the README

  3. Integrate with your verification framework (supports standard interfaces)

  4. Configure activation-specific settings for optimal performance

For a unified Python implementation with easy integration into PyTorch workflows, see the wraact library.
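
The snippet below is only a hypothetical workflow shape, with placeholder names rather than wraact's actual API (consult the repository README for the real entry points): per neuron or neuron group, a hull constructor takes pre-activation bounds and returns half-space constraints that feed the LP/MILP back end.

# Hypothetical integration sketch; dummy_hull and collect_layer_constraints are
# placeholders for illustration only, NOT wraact's real interface.
import numpy as np

def dummy_hull(lb, ub):
    # Stand-in hull constructor: returns box constraints A @ (x, y) <= b.
    A = np.array([[1.0, 0.0], [-1.0, 0.0]])
    b = np.array([ub, -lb])
    return A, b

def collect_layer_constraints(pre_lb, pre_ub, hull_fn=dummy_hull):
    # One H-representation per neuron (or neuron group), gathered for the solver.
    return [hull_fn(lb, ub) for lb, ub in zip(pre_lb, pre_ub)]

hulls = collect_layer_constraints([-1.0, -2.5], [1.0, 0.5])
print(len(hulls), "hulls ready for the verification back end")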

Publications & Resources

Conference Paper

Zhongkui Ma, Zihan Wang, Guangdong Bai. “Convex Hull Approximation for Activation Functions”. ACM SIGPLAN Conference on Object-Oriented Programming, Systems, Languages, and Applications (OOPSLA’25), Singapore, October 2025.

Presentations

  • OOPSLA’25 Presentation: Thursday, October 16, 2025, 16:00-16:15 at Orchid West, Marina Bay Sands Convention Centre, Singapore

  • SPLASH’25 Conference: October 2025

Code & Artifacts

GitHub repository: Trusted-System-Lab/WraAct

BibTeX Citation

@inproceedings{ma2025convex,
  author = {Ma, Zhongkui and Wang, Zihan and Bai, Guangdong},
  title = {Convex Hull Approximation for Activation Functions},
  booktitle = {Proceedings of the ACM SIGPLAN Conference on Object-Oriented Programming, Systems, Languages, and Applications},
  series = {OOPSLA '25},
  year = {2025},
  doi = {10.1145/3763086},
  publisher = {ACM},
  address = {Singapore}
}