OOPSLA'25 · Verification · Activation Functions · Convex Hull · Formal Methods

WraAct: Convex Hull Approximation for General Activation Functions

Efficiently constructing tight over-approximations of activation function hulls, supporting Sigmoid, Tanh, MaxPool, and more, 400X faster than the state of the art.

Overview

WraAct is an approach to efficiently constructing tight over-approximations of activation function hulls for neural network verification. Published at OOPSLA’25 (the ACM SIGPLAN Conference on Object-Oriented Programming, Systems, Languages, and Applications), co-located with SPLASH’25.

Key Idea

WraAct’s core idea is to introduce linear constraints that smooth out the fluctuations of the target function, using double-linear-piece (DLP) functions to simplify its local geometry. The problem is thereby reduced to over-approximating DLP functions, which can be handled efficiently, as sketched below.
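
To make this concrete, the sketch below shows one way to sandwich the sigmoid between two DLP functions on an interval [l, u] that straddles its inflection point at 0: a chord on the convex side paired with a tangent at 0 on the concave side. This is an illustrative construction in plain NumPy under those assumptions, not WraAct's implementation or API.

```python
# Illustrative only: sandwich sigmoid between two DLP (two-piece linear)
# functions on [l, u] with l < 0 < u. Not WraAct's implementation; the bound
# choices (chord on the convex side, tangent at 0 on the concave side) are
# just one simple valid option.
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def dlp_bounds(l, u):
    """Return (lower, upper) with lower(x) <= sigmoid(x) <= upper(x) on [l, u]."""
    assert l < 0.0 < u
    # Upper DLP: chord over the convex part [l, 0], tangent at 0 over the concave part [0, u].
    chord_left = (0.5 - sigmoid(l)) / (0.0 - l)   # slope of the chord through (l, s(l)) and (0, 0.5)

    def upper(x):
        return np.maximum(0.5 + chord_left * x, 0.5 + 0.25 * x)

    # Lower DLP: tangent at 0 over [l, 0], chord over the concave part [0, u].
    chord_right = (sigmoid(u) - 0.5) / u          # slope of the chord through (0, 0.5) and (u, s(u))

    def lower(x):
        return np.minimum(0.5 + 0.25 * x, 0.5 + chord_right * x)

    return lower, upper

if __name__ == "__main__":
    l, u = -4.0, 3.0
    lower, upper = dlp_bounds(l, u)
    xs = np.linspace(l, u, 1001)
    assert np.all(lower(xs) <= sigmoid(xs) + 1e-9)
    assert np.all(sigmoid(xs) <= upper(xs) + 1e-9)
```

Once an activation is bracketed this way, the original hull problem reduces to over-approximating the DLP functions, which is the part WraAct handles efficiently.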

Supported Activation Functions

  • ReLU — Rectified Linear Unit
  • Sigmoid — Logistic sigmoid function
  • Tanh — Hyperbolic tangent
  • MaxPool — Max pooling operations
  • GeLU — Gaussian Error Linear Unit
  • And more general activation functions (plain definitions of the listed functions are sketched below)
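
For reference, the block below writes out plain NumPy definitions of the listed activations, with GeLU in its common tanh approximation. These are standard definitions for orientation only, not part of the wraact library's API.

```python
# Plain NumPy definitions of the listed activations, for reference only
# (standard textbook definitions, not the wraact library's API).
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def tanh(x):
    return np.tanh(x)

def maxpool(x, axis=-1):
    # Max over a pooling window laid out along `axis`.
    return np.max(x, axis=axis)

def gelu(x):
    # Tanh approximation of GELU (exact GELU is x * Phi(x), the Gaussian CDF).
    return 0.5 * x * (1.0 + np.tanh(np.sqrt(2.0 / np.pi) * (x + 0.044715 * x**3)))
```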

Performance Highlights

  • 400X faster than the state-of-the-art multi-neuron over-approximation method (SBLM+PDDM)
  • 150X better precision on average
  • 50% fewer constraints on average
  • Handles up to 8 input dimensions in under 10 seconds
  • Verifies large networks such as ResNets with 22k neurons in under one minute per sample

Verification Impact

On 100 benchmark samples:

  • Lifts single-neuron verification from under 10 to over 40 verified samples
  • Outperforms the multi-neuron verifier PRIMA, verifying up to 20 additional samples (see the LP sketch below)
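
As a rough illustration of how relaxation constraints drive such verification (not WraAct's actual pipeline), the sketch below encodes simple, provably valid linear bounds for y_i = sigmoid(x_i) over an input box, using only the fact that sigmoid's slope never exceeds 1/4, and then bounds a linear output with an off-the-shelf LP solver.

```python
# Generic illustration of LP-based verification with relaxation constraints;
# not WraAct's pipeline. For y_i = sigmoid(x_i) over a box we add two provably
# valid rows per neuron (sigmoid's slope never exceeds 1/4), then bound a
# linear output w . y + b with an off-the-shelf LP solver.
import numpy as np
from scipy.optimize import linprog

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

lo = np.array([-2.0, -1.0])   # input box lower bounds for x1, x2
hi = np.array([1.0, 3.0])     # input box upper bounds for x1, x2
w = np.array([1.0, -2.0])     # output weights on y1, y2
b = 0.5                       # output bias

# Decision variables: z = [x1, y1, x2, y2].
A_ub, b_ub = [], []
for i in range(2):
    xi, yi = 2 * i, 2 * i + 1
    # y_i <= sigmoid(lo_i) + 0.25 * (x_i - lo_i)
    row = np.zeros(4); row[yi] = 1.0; row[xi] = -0.25
    A_ub.append(row); b_ub.append(sigmoid(lo[i]) - 0.25 * lo[i])
    # y_i >= sigmoid(hi_i) + 0.25 * (x_i - hi_i)
    row = np.zeros(4); row[yi] = -1.0; row[xi] = 0.25
    A_ub.append(row); b_ub.append(0.25 * hi[i] - sigmoid(hi[i]))

# Variable bounds: x_i in its box, y_i between sigmoid(lo_i) and sigmoid(hi_i).
var_bounds = []
for i in range(2):
    var_bounds += [(lo[i], hi[i]), (sigmoid(lo[i]), sigmoid(hi[i]))]

# Maximize w . [y1, y2] + b by minimizing the negated objective.
c = np.zeros(4); c[1], c[3] = -w[0], -w[1]
res = linprog(c, A_ub=np.vstack(A_ub), b_ub=np.array(b_ub),
              bounds=var_bounds, method="highs")
print("sound upper bound on the output:", -res.fun + b)
```

Tighter constraints, such as multi-neuron hull constraints, would simply enter as additional rows of A_ub; since every added row is sound, the computed bound can only tighten.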

Comparison with WraLU

Feature        | WraLU (POPL’24)         | WraAct (OOPSLA’25)
Activation     | ReLU only               | General (Sigmoid, Tanh, MaxPool, etc.)
Approach       | Wrapping polytope       | DLP-based smoothing
Multi-neuron   | Yes                     | Yes
Scalability    | Up to ~4D efficiently   | Up to 8D in 10s

Related Projects

  • WraLU: ReLU-specific hull approximation (POPL’24)
  • wraact (library): Unified Python library for hull approximation
  • propdag: Bound propagation framework (a generic interval-propagation sketch follows below)
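
Since propdag is described only as a bound propagation framework, the sketch below shows the basic flavor of interval bound propagation through one affine layer followed by a monotone activation. It is a generic illustration, not propdag's API.

```python
# Generic interval bound propagation (IBP) through y = W x + b followed by a
# monotone activation; illustrative only, not the propdag API.
import numpy as np

def affine_bounds(W, b, lo, hi):
    """Propagate an input box [lo, hi] through y = W x + b."""
    W_pos, W_neg = np.maximum(W, 0.0), np.minimum(W, 0.0)
    new_lo = W_pos @ lo + W_neg @ hi + b
    new_hi = W_pos @ hi + W_neg @ lo + b
    return new_lo, new_hi

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Example: one hidden layer of 3 neurons over a 2-D input box.
rng = np.random.default_rng(0)
W, b = rng.normal(size=(3, 2)), rng.normal(size=3)
lo, hi = np.array([-1.0, -1.0]), np.array([1.0, 1.0])

pre_lo, pre_hi = affine_bounds(W, b, lo, hi)
post_lo, post_hi = sigmoid(pre_lo), sigmoid(pre_hi)  # sigmoid is monotone
print(post_lo, post_hi)
```

A real bound propagation framework carries tighter symbolic or linear bounds layer by layer; this interval version only conveys the overall shape of the computation.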

Citation

@article{10.1145/3763086,
  author = {Ma, Zhongkui and Wang, Zihan and Bai, Guangdong},
  title = {Convex Hull Approximation for Activation Functions},
  journal = {Proc. ACM Program. Lang.},
  year = {2025},
  issue_date = {October 2025},
  publisher = {Association for Computing Machinery},
  volume = {9},
  number = {OOPSLA2},
  url = {https://doi.org/10.1145/3763086},
  doi = {10.1145/3763086},
  month = oct,
  articleno = {308},
  numpages = {27}
}