In the new paper Exphormer: Sparse Transformers for Graphs, a team from the University of British Columbia, Google Research and the Alberta Machine Intelligence Institute proposes Exphormer, a class of graph transformers with improved scalability and reduced computational complexity that achieves state-of-the-art performance on graph benchmarks.
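Exphormer's sparsity comes from restricting attention to a small set of node pairs rather than all of them. The sketch below is an illustrative simplification, not the paper's implementation: it builds a boolean attention mask from (a) the input graph's edges, (b) a random overlay formed from a few random permutations (standing in for the paper's expander graph; unions of random permutations are a classical randomized expander construction), and (c) one extra global "virtual" node that attends to everything. All function and parameter names here are invented for the example.

```python
import numpy as np

def exphormer_style_mask(edges, n, rng, overlay_degree=3):
    """Sparse attention mask over n graph nodes plus one global virtual node.

    Attention is allowed along: graph edges, a random expander-like overlay,
    the global node's row/column, and the diagonal (self-attention).
    """
    mask = np.zeros((n + 1, n + 1), dtype=bool)

    # (a) local attention along the input graph's edges
    for u, v in edges:
        mask[u, v] = mask[v, u] = True

    # (b) random overlay: union of a few random permutations, a simple
    # stand-in for the expander graph used in the actual paper
    for _ in range(overlay_degree):
        perm = rng.permutation(n)
        for u in range(n):
            if perm[u] != u:
                mask[u, perm[u]] = mask[perm[u], u] = True

    # (c) one global virtual node (index n) attends to and from every node
    mask[n, :] = True
    mask[:, n] = True

    # self-attention for every token
    np.fill_diagonal(mask, True)
    return mask
```

Because every node gets O(overlay_degree) overlay neighbors plus its graph neighbors plus the global node, the number of allowed attention pairs grows linearly in nodes and edges instead of quadratically.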
In the new paper GraphCast: Learning Skillful Medium-Range Global Weather Forecasting, a research team from DeepMind and Google presents GraphCast, a machine-learning (ML)-based weather simulator that scales well with data and can generate a 10-day forecast in under 60 seconds. GraphCast outperforms the world’s most accurate deterministic operational medium-range weather forecasting system and surpasses existing ML-based baselines.
In the new paper TF-GNN: Graph Neural Networks in TensorFlow, a research team from Google Core ML, Google Research, and DeepMind open-sources TensorFlow GNN (TF-GNN), a scalable library for building graph neural network models on heterogeneous relational data.
A research team from Yale and IBM presents Kernel Graph Neural Networks (KerGNNs), which integrate graph kernels into the message-passing process of GNNs within a single framework, achieving performance comparable to state-of-the-art methods while significantly improving model interpretability compared with conventional GNNs.
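The core idea of kernel-based message passing can be illustrated with a toy example. The sketch below is not the KerGNN implementation: it uses a simple walk-count graph kernel (the inner product of each graph's length-1..P total walk counts, a valid kernel since it is an explicit feature-map inner product), and updates each node by comparing its 1-hop ego subgraph against a small fixed "filter" graph. In KerGNNs the filter graphs are trainable; here the filter is a fixed input, and all names are invented for the example.

```python
import numpy as np

def walk_count_features(A, max_len=3):
    """Feature p = total number of walks of length p (= sum of entries of A^p)."""
    A = np.asarray(A, dtype=float)
    feats, Ap = [], np.eye(len(A))
    for _ in range(max_len):
        Ap = Ap @ A
        feats.append(Ap.sum())
    return np.array(feats)

def walk_kernel(A1, A2, max_len=3):
    """Inner product of walk-count feature vectors -> a valid graph kernel."""
    return float(walk_count_features(A1, max_len) @ walk_count_features(A2, max_len))

def kergnn_style_layer(A, filter_A, max_len=3):
    """Each node's new feature is the kernel similarity between its
    1-hop ego subgraph and a small filter graph (fixed here, learned in KerGNNs)."""
    A = np.asarray(A, dtype=float)
    n = len(A)
    out = np.zeros(n)
    for v in range(n):
        nbrs = np.flatnonzero(A[v])
        idx = np.concatenate(([v], nbrs)).astype(int)
        ego = A[np.ix_(idx, idx)]          # induced 1-hop ego subgraph
        out[v] = walk_kernel(ego, filter_A, max_len)
    return out
```

Unlike a plain sum-of-neighbors aggregation, the update responds to the *structure* of each neighborhood (e.g. how many closed walks it contains), which is what makes the learned filter graphs inspectable and the model more interpretable.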
A research team from the University of Cambridge, Imperial College London & Twitter, UCLA, MPI-MIS, and SJTU & UNSW proposes CW Networks (CWNs), a message-passing scheme that operates on regular cell complexes and achieves stronger expressive power than standard message-passing graph neural networks (GNNs).
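The key move in CWNs is to lift a graph into a cell complex and pass messages between cells of different dimensions, not just between nodes. The toy sketch below is an illustrative simplification of that idea, not the paper's construction: it lifts a graph by treating nodes as 0-cells, edges as 1-cells, and triangles (3-cliques) as 2-cells, then runs one aggregation step in which each edge gathers from its boundary (its two endpoint nodes) and its co-boundary (the triangles it bounds). All names are invented for the example, and features are kept scalar for brevity.

```python
from itertools import combinations

def lift_to_cell_complex(edges, n):
    """0-cells: nodes 0..n-1; 1-cells: edges; 2-cells: triangles (3-cliques).

    Lifting cliques (or cycles) to higher cells is one simple lifting choice.
    """
    edge_set = {frozenset(e) for e in edges}
    one_cells = sorted(tuple(sorted(e)) for e in edge_set)
    two_cells = [tuple(sorted(t)) for t in combinations(range(n), 3)
                 if all(frozenset(p) in edge_set for p in combinations(t, 2))]
    return one_cells, two_cells

def cw_style_edge_step(node_feats, one_cells, two_cells):
    """One aggregation step for edge (1-cell) features: sum the boundary
    node features plus a count of co-boundary 2-cells containing the edge."""
    edge_feats = {}
    for (u, v) in one_cells:
        boundary = node_feats[u] + node_feats[v]
        coboundary = sum(1 for t in two_cells if u in t and v in t)
        edge_feats[(u, v)] = boundary + coboundary
    return edge_feats
```

Because edges inside a triangle receive an extra co-boundary signal that edges outside any triangle do not, this scheme can already distinguish some substructures that plain node-to-node message passing treats identically, which is the intuition behind CWNs' added expressive power.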