A new paper from a multi-institutional research team proposes CW Networks, a message-passing method that delivers better expressivity than commonly used graph neural networks (GNNs) and achieves state-of-the-art results across a variety of molecular datasets.
The expressive power of GNNs mainly reflects their capability to distinguish whether two given graphs are isomorphic. Recent studies, however, have shown that GNNs struggle with long-range interactions and lack a principled way to model higher-order structures, limiting their expressive power.
In the paper Weisfeiler and Lehman Go Cellular: CW Networks, researchers from the University of Cambridge, Imperial College London & Twitter, UCLA, MPI-MIS, and SJTU & UNSW propose CW Networks (CWNs). The novel message-passing scheme enables principled modelling of higher-order signals and compression of distances between nodes. It operates on regular cell complexes and demonstrates stronger expressive power than GNNs.
The graph isomorphism problem is a candidate NP-intermediate problem — it is not known to be solvable in polynomial time, nor is it known to be NP-complete. The Weisfeiler-Lehman (WL) test, introduced in the 1968 paper A Reduction of a Graph to Canonical Form and an Algebra Arising During this Reduction, can be used to benchmark the expressive power of today’s GNNs.
The team first shows how to transform graphs into higher-dimensional cell complexes, then performs colour refinement on the resulting complexes to test them for isomorphism.
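The lifting step can be sketched as follows: nodes become 0-cells, edges become 1-cells, and rings (chordless cycles) are attached as 2-cells whose boundaries are their edge sets. This is a minimal illustration in plain Python; the function and variable names are illustrative, not from the paper's code.

```python
# A minimal sketch of lifting a graph to a 2D regular cell complex,
# where rings are attached as 2-cells. Names are illustrative only.

def lift_to_cell_complex(nodes, edges, rings):
    """Represent a cell complex as three layers of cells.

    0-cells are nodes, 1-cells are edges, and each ring (a cycle given
    as a node tuple) becomes a 2-cell whose boundary is its edge set.
    """
    edge_set = {frozenset(e) for e in edges}
    two_cells = []
    for ring in rings:
        # boundary of the ring: consecutive edges around the cycle
        boundary = {frozenset((ring[i], ring[(i + 1) % len(ring)]))
                    for i in range(len(ring))}
        assert boundary <= edge_set, "ring must consist of existing edges"
        two_cells.append(boundary)
    return {"0-cells": set(nodes), "1-cells": edge_set, "2-cells": two_cells}

# Toy molecular graph: a 4-ring with one pendant node attached.
nodes = [0, 1, 2, 3, 4]
edges = [(0, 1), (1, 2), (2, 3), (3, 0), (3, 4)]
complex_ = lift_to_cell_complex(nodes, edges, rings=[(0, 1, 2, 3)])
print(len(complex_["2-cells"]))  # the single 4-ring becomes one 2-cell
```

For molecular graphs, the rings would come from a ring-detection routine (e.g. chordless-cycle enumeration); here they are supplied by hand to keep the sketch short.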
As their colour refinement scheme for cell complexes, the team introduces Cellular WL (CWL), which generalizes both the Simplicial WL test and the classical WL test. The procedure comprises three steps: 1) Given a regular cell complex, all cells are initialized with the same colour; 2) At each iteration, the colour of a cell is updated by injectively mapping the multi-sets of colours of its adjacent cells with a fixed HASH function; 3) The algorithm stops when a stable colouring is reached. Two cell complexes are considered non-isomorphic if their colour histograms differ — otherwise, the test is inconclusive. The researchers also show that, for a wide range of lifting transformations, CWL applied to cell complexes is more powerful than WL applied to the initial graphs.
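The three-step refinement loop above can be sketched in a few lines. This toy version flattens the different adjacency relations of a cell complex into a single neighbour map for brevity; a faithful CWL would hash each adjacency type separately.

```python
# A minimal sketch of the colour-refinement loop: uniform initial
# colours, repeated hashing of neighbour-colour multisets, stop at a
# stable colouring, and compare colour histograms.

from collections import Counter

def colour_refinement(cells, neighbours, max_iters=10):
    colours = {c: 0 for c in cells}                # 1) uniform initial colour
    for _ in range(max_iters):
        new = {}
        for c in cells:
            multiset = tuple(sorted(colours[n] for n in neighbours[c]))
            new[c] = hash((colours[c], multiset))  # 2) stand-in for HASH
        # relabel hash values to small ints so colourings stay comparable
        relabel = {h: i for i, h in enumerate(sorted(set(new.values())))}
        new = {c: relabel[h] for c, h in new.items()}
        if new == colours:                         # 3) stable colouring
            break
        colours = new
    return Counter(colours.values())               # colour histogram

# Two tiny structures: a triangle of cells vs. a path of three cells.
tri = colour_refinement("abc", {"a": "bc", "b": "ac", "c": "ab"})
path = colour_refinement("abc", {"a": "b", "b": "ac", "c": "b"})
print(tri != path)  # different histograms → reported non-isomorphic
```

In the triangle every cell sees the same neighbourhood, so refinement stabilises immediately with one colour; in the path the middle cell separates from the endpoints, yielding a different histogram.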
The team further describes CWNs with an applied focus on molecular graphs. The cells in the proposed CWNs receive two types of messages: the first carries messages from atoms to bonds and from bonds to rings; the second carries messages between atoms connected by a bond and between bonds that belong to the same ring. The team shows that CWNs are at most as powerful as CWL, and that by reducing the number of layers needed in the presence of long-range node interactions, CWNs can address the long-range interaction problems that hinder traditional GNNs. Moreover, CWNs can be seen as a (non-linear) generalization of linear diffusion operators on cell complexes, with computational complexity that remains linear in the size of the input complex.
The team implements a cell isomorphism network (CIN) model, which stacks CWN layers with local aggregators as in GIN. They tested the model on eight small- and medium-sized TUDataset benchmarks from biology (PROTEINS), chemistry (molecules: MUTAG, PTC, NCI1 and NCI109) and social networks (IMDB-B, IMDB-M, RDT-B), comparing CIN against the baselines RWK, GK (with k = 3), PK, the WL kernel, DCNN, DGCNN, IGN, GIN, PPGNs, Natural GN, GSN, and SIN.
CIN achieved the best mean accuracy on four of the eight datasets and ranked second on the others. The experiments show that the proposed CW Networks approach can obtain state-of-the-art results on popular molecular graph benchmarks and related tasks, validating the power of its message-passing scheme on cell complexes.
The paper Weisfeiler and Lehman Go Cellular: CW Networks is on arXiv.
Author: Hecate He | Editor: Michael Sarazen, Chain Zhang
We know you don’t want to miss any news or research breakthroughs. Subscribe to our popular newsletter Synced Global AI Weekly to get weekly AI updates.