TinySpeech: Novel Attention Condensers Enable Deep Recognition Networks on Edge Devices
Novel attention condensers designed to enable the building of low-footprint, highly efficient deep neural networks for on-device speech recognition at the edge.
AI Technology & Industry Review
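To make the idea in the deck above a little more concrete, below is a minimal, hypothetical sketch of what an attention-condenser-style module might look like. The class name `AttentionCondenser`, the `channels` and `reduction` parameters, and the specific layer choices are illustrative assumptions following the general condense-embed-expand-and-selectively-attend pattern described for TinySpeech; this is not DarwinAI's actual implementation.

```python
# Illustrative sketch of an attention-condenser-style module (assumed
# structure: condense -> embed -> expand -> selective attention);
# NOT DarwinAI's exact TinySpeech implementation.
import torch
import torch.nn as nn
import torch.nn.functional as F

class AttentionCondenser(nn.Module):
    def __init__(self, channels: int, reduction: int = 2):
        super().__init__()
        # Condensation: reduce spatial resolution so the attention
        # computation stays lightweight.
        self.condense = nn.MaxPool2d(kernel_size=reduction)
        # Condensed embedding: cheap grouped convolutions that model
        # joint local and cross-channel activation relationships.
        self.embed = nn.Sequential(
            nn.Conv2d(channels, channels, kernel_size=3, padding=1, groups=channels),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels, channels, kernel_size=1),
        )
        # Learned scale for the selective-attention output (an assumption here).
        self.scale = nn.Parameter(torch.ones(1))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        attn = self.condense(x)
        attn = self.embed(attn)
        # Expansion: bring the attention map back to the input resolution.
        attn = F.interpolate(attn, size=x.shape[-2:], mode="nearest")
        attn = torch.sigmoid(attn)
        # Selective attention: re-weight the input activations.
        return x * attn * self.scale

# Example: re-weight a batch of two mel-spectrogram-like feature maps.
x = torch.randn(2, 16, 40, 100)
print(AttentionCondenser(16)(x).shape)  # torch.Size([2, 16, 40, 100])
```

Because the attention map is computed at reduced resolution with grouped convolutions, the module adds very few parameters and multiply-accumulate operations relative to a standard self-attention block, which is the property that matters for microcontroller-class edge hardware.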
Recently, Facebook AI Research (FAIR) researchers introduced a structured memory layer that can be easily integrated into a neural network to greatly expand network capacity and parameter count without significantly increasing computational cost.
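The sketch below illustrates the general mechanism behind such a layer: a very large learnable value table supplies the extra parameters, while a sparse top-k lookup keeps per-example compute small. The class name `SparseMemoryLayer` and the `num_slots` and `topk` parameters are hypothetical, and the full key scoring shown here is a simplification; FAIR's actual design uses product keys precisely to avoid scoring every memory slot.

```python
# Simplified key-value memory layer in the spirit of FAIR's product-key
# memory: a huge value table adds capacity, a sparse top-k lookup keeps
# compute low. Illustrative sketch only, not FAIR's implementation.
import torch
import torch.nn as nn
import torch.nn.functional as F

class SparseMemoryLayer(nn.Module):
    def __init__(self, dim: int, num_slots: int = 16384, topk: int = 32):
        super().__init__()
        self.query = nn.Linear(dim, dim)            # maps hidden state to a query
        self.keys = nn.Parameter(torch.randn(num_slots, dim) * dim ** -0.5)
        self.values = nn.Embedding(num_slots, dim)  # bulk of the added parameters
        self.topk = topk

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, dim) hidden states from anywhere inside the host network.
        q = self.query(x)                               # (batch, dim)
        scores = q @ self.keys.t()                      # (batch, num_slots)
        top_scores, top_idx = scores.topk(self.topk, dim=-1)
        weights = F.softmax(top_scores, dim=-1)         # (batch, topk)
        selected = self.values(top_idx)                 # (batch, topk, dim)
        # Only topk of num_slots value vectors are read and updated per example.
        return x + (weights.unsqueeze(-1) * selected).sum(dim=1)

# The layer can be dropped between two blocks of an existing network.
h = torch.randn(4, 256)
print(SparseMemoryLayer(256)(h).shape)  # torch.Size([4, 256])
```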
Enter DarwinAI, a Waterloo, Ontario-based AI startup that recently released a beta version of an automated machine learning solution it says can generate models ten times more efficiently than comparable state-of-the-art solutions.
The University of Waterloo launched the Waterloo Artificial Intelligence Institute this April, pooling 8 affiliated research centers, 23 labs, and some 100 faculty members into a new Canadian AI supercluster with a focus on operational AI research.
Canada is determined to build AI superclusters in Toronto-Waterloo, Montréal, and Edmonton.
A computer science graduate student named Kaheer Suleman founded a company called Maluuba, building its product around an intelligent program he had invented.