Zhongkui Ma
PhD Student @ The University of Queensland | Trustworthy AI · Formal Verification · Neural Network Robustness
Hi there! I'm Zhongkui Ma (马中奎), currently diving deep into my PhD journey at the University of Queensland. I'm fortunate to be supervised by A/Prof. Guangdong Bai.
News
New!!! 🎉 🎉 🎉
Nov. 2025
Paper Accepted
Our paper Mitigating Gradient Inversion Risks in Language Models via Token Obfuscation is accepted by AsiaCCS'2026. Congrats, Xinguo!
Oct. 2025
Presentation
I present our work Convex Hull Approximation for Activation Functions at OOPSLA'25 in Singapore (Thu 16 Oct 2025, 16:00-16:15, Orchid West, Marina Bay Sands Convention Centre).
Aug. 2025
Paper Accepted
Our paper Convex Hull Approximation for Activation Functions is accepted by OOPSLA'25 within SPLASH'25. Happy!
Jan. 2025
Paper Accepted
Our paper AI Model Modulation with Logits Redistribution is accepted by WWW'25. Congrats, Zihan!
Dec. 2024
Tutorial
Prof. Bai and I presented a tutorial on Robustness Verification of Neural Networks using WraLU at ADC'24 (Tue 16 Dec 2024, Gold Coast, Australia), and we were awarded Distinguished Tutorial Speaker!
Aug. 2024
Paper Accepted
Our paper Uncovering Gradient Inversion Risks in Practical Language Model Training is accepted by CCS'24. Congrats, Xinguo!
May 2024
Presentation
I will give a presentation on ReLU Hull Approximation at the workshop Formal Methods in Australia and New Zealand, held 29--30 May 2024 at the University of Queensland, St Lucia. My slot is 15:30--16:00 on 30 May 2024.
Apr. 2024
Presentation
I will present our work ReLU Hull Approximation at the FPBench community monthly meeting on May 2nd, 9:00-10:00 AM Pacific time.
Apr. 2024
Paper Accepted
Our paper CORELOCKER: Neuron-level Usage Control is accepted by S&P'24. Congrats, Zihan! [Live Video]
Jan. 2024
Presentation
I present our work ReLU Hull Approximation at POPL'24, in the Kelvin Room of the Institution of Engineering and Technology (IET) in London, UK. [Live Video]
Nov. 2023
Presentation
I participated in ICFEM'23 in Brisbane, Australia, where I presented our work Formalizing Robustness Against Character-Level Perturbations for Neural Network Language Models and my doctoral symposium paper Verifying Neural Networks by Approximating Convex Hulls.
Guides & Tutorials
I maintain the Neural Network Verification (NNV) Guides, a comprehensive collection covering everything from foundational concepts to cutting-edge research techniques. The guides include in-depth technical content designed for practitioners and researchers interested in formal verification, robustness testing, and trustworthy AI.
📝 New Blogs
Jan. 2026
PropDAG, a template framework for bound propagation that eliminates infrastructure boilerplate in verification research.
Jan. 2026
ShapeONNX, explaining how dual-track shape representation solves ONNX's dynamic shape inference problem for verification tools.
Jan. 2026
SlimONNX, covering ONNX optimization for verification workflows, operator fusion challenges, and design decisions.
Jan. 2026
TorchONNX, a 6-stage compiler that converts ONNX models to native PyTorch with zero runtime overhead and vmap compatibility.
Jan. 2026
TorchVNNLIB, achieving 10-100x faster property loading for neural network verification with binary formats.
📚 Featured Guides
Phase 1
Neural Network Verification
Learn neural network verification in 3 minutes: core concepts and quick overview
Phase 1
Verification Problem
Mathematical formulation of neural network verification and certification
Phase 2
Bound Propagation
Core verification method using Interval Bound Propagation (IBP) and abstract interpretation
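To give a flavour of what the Bound Propagation guide covers, here is a minimal sketch of Interval Bound Propagation through one affine layer followed by a ReLU. The weights and intervals are toy values I made up for illustration; this is not code from the guides themselves.

```python
import numpy as np

def ibp_affine(W, b, lo, hi):
    # Split W into positive and negative parts so each output bound
    # pairs with the correct end of the input interval.
    W_pos, W_neg = np.maximum(W, 0.0), np.minimum(W, 0.0)
    new_lo = W_pos @ lo + W_neg @ hi + b
    new_hi = W_pos @ hi + W_neg @ lo + b
    return new_lo, new_hi

def ibp_relu(lo, hi):
    # ReLU is monotone, so applying it to both endpoints stays sound.
    return np.maximum(lo, 0.0), np.maximum(hi, 0.0)

# Toy layer: 2 inputs, 2 outputs, input perturbed within +/- 0.1 of 0.
W = np.array([[1.0, -1.0], [2.0, 0.5]])
b = np.array([0.0, -1.0])
lo, hi = np.array([-0.1, -0.1]), np.array([0.1, 0.1])

lo, hi = ibp_affine(W, b, lo, hi)
lo, hi = ibp_relu(lo, hi)
# lo and hi now soundly (if loosely) bound every reachable output.
```

IBP is the cheapest propagation method; tighter relaxations (like the convex hulls in my tools below) trade more computation for less looseness.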
My Recent GitHub Repositories
I’m currently working on some exciting tools that I’m thrilled to share with you:
🎯 Featured Verification Tools
Fast and precise ReLU hull approximation (POPL'24). Achieves 10X-10⁶X speedup with 50% fewer constraints.
Convex hull approximation for general activation functions (OOPSLA'25). Supports Sigmoid, Tanh, MaxPool and more with 400X faster efficiency.
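For background, the classic single-neuron case these tools generalize is the ReLU "triangle" relaxation. The sketch below is my own minimal illustration of that textbook relaxation, not code from WraLU or WraAct.

```python
# Triangle relaxation of a single ReLU over an input interval [l, u]
# with l < 0 < u. The convex hull of {(x, relu(x)) : l <= x <= u}
# has three facets: y >= 0, y >= x, and the upper facet below.
def relu_triangle_upper(l, u):
    slope = u / (u - l)
    return lambda x: slope * (x - l)

l, u = -1.0, 2.0
upper = relu_triangle_upper(l, u)
for x in (l, -0.5, 0.0, 1.0, u):
    y = max(x, 0.0)                  # exact ReLU value
    assert y >= 0.0 and y >= x       # lower facets are sound
    assert y <= upper(x) + 1e-9      # upper facet dominates ReLU
```

Multi-neuron hull methods tighten this by bounding several neurons jointly instead of one triangle per neuron.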
🔧 Supporting Libraries
A unified Python library for approximating activation function hulls with convex polytopes. A regularly maintained repository for the core algorithm in WraLU and WraAct, supporting activation functions such as ReLU, Sigmoid, Tanh, and GeLU.
A tool to infer the shapes in an ONNX model when the official shape-inference tool fails. A simple yet powerful tool that helps you understand the dimensions of your model's inputs and outputs.
A tool to optimize and simplify your ONNX models by removing redundant operations and resolving version issues. Makes ONNX files cleaner, more efficient, and ready for action!
A tool for converting ONNX models to PyTorch models (`.pth` for parameters, `.py` for structure). Simple, lightweight, and designed for seamless model conversion.
A tool to convert VNN-LIB files (`.vnnlib`) to PyTorch tensors (`.pth` files) for efficient neural network verification. Take full advantage of the PyTorch ecosystem.
A bound propagation framework for neural network verification. Supports any DAG (Directed Acyclic Graph) structure, covering both feedforward and backward propagation patterns for verification. Allows researchers to focus on their algorithms without worrying about complex computation graphs.
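The core idea behind a DAG-based propagation framework can be sketched in a few lines: walk the graph in topological order and apply each node's transfer function to its predecessors' bounds. The names and structure below are illustrative assumptions, not PropDAG's actual API.

```python
def topo_order(preds):
    """Return DAG nodes in topological order (preds: node -> parent list)."""
    order, seen = [], set()
    def visit(n):
        if n in seen:
            return
        seen.add(n)
        for p in preds[n]:
            visit(p)
        order.append(n)
    for n in preds:
        visit(n)
    return order

def propagate(preds, transfer, input_bounds):
    """Push (lower, upper) interval bounds through the graph."""
    bounds = dict(input_bounds)
    for n in topo_order(preds):
        if n not in bounds:
            bounds[n] = transfer[n]([bounds[p] for p in preds[n]])
    return bounds

# Tiny residual-style graph: x -> relu(x) -> x + relu(x).
preds = {"x": [], "relu": ["x"], "add": ["x", "relu"]}
transfer = {
    "relu": lambda bs: (max(bs[0][0], 0.0), max(bs[0][1], 0.0)),
    "add": lambda bs: (bs[0][0] + bs[1][0], bs[0][1] + bs[1][1]),
}
out = propagate(preds, transfer, {"x": (-1.0, 2.0)})
# out["add"] soundly bounds x + relu(x) for x in [-1, 2].
```

With this separation, a researcher only writes transfer functions; the graph traversal, including skip connections, is handled once by the framework.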
The above map shows the distribution of visitors to this website.
This widget is provided by ClustrMaps.