For Its Latest Trick, OpenAI’s GPT-3 Generates Images From Text Captions
OpenAI has trained a neural network called DALL·E that creates images from text captions for a wide range of concepts expressible in natural language.
In a new paper, researchers from Google, OpenAI, and DeepMind introduce “behaviour priors,” a framework designed to capture common movement and interaction patterns that are shared across a set of related tasks or contexts.
Microsoft announced today that it has teamed up with OpenAI to exclusively license the AI research institute’s GPT-3 language model.
OpenAI sets out to advance methods for training large-scale language models on objectives that more closely capture human preferences.
OpenAI researchers introduce GPT-f, an automated prover and proof assistant for the Metamath formalization language.
Although OpenAI hasn’t yet officially announced the GPT-3 pricing scheme, Branwen’s sneak peek has piqued the interest of the NLP community.
OpenAI’s 175 billion parameter language model GPT-3 has gone viral once again.
Large transformer-based language models trained on pixel sequences can generate coherent images without the use of labels.
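The idea can be illustrated with a minimal sketch (an assumption for illustration, not OpenAI's iGPT code): quantize each image to a small palette of discrete pixel tokens, flatten it in raster order, and train a causal Transformer to predict the next pixel token, exactly as a language model predicts the next word. The palette size, 32×32 resolution and model dimensions below are made up for the example.

```python
# Minimal sketch of label-free autoregressive pixel modeling (illustrative only).
import torch
import torch.nn as nn

VOCAB = 16          # hypothetical palette size after colour quantization
SEQ_LEN = 32 * 32   # a 32x32 image flattened in raster order

class PixelTransformer(nn.Module):
    def __init__(self, d_model=128, nhead=4, nlayers=4):
        super().__init__()
        self.embed = nn.Embedding(VOCAB, d_model)
        self.pos = nn.Embedding(SEQ_LEN, d_model)
        layer = nn.TransformerEncoderLayer(d_model, nhead, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, nlayers)
        self.head = nn.Linear(d_model, VOCAB)

    def forward(self, tokens):
        # tokens: (batch, seq) integers in [0, VOCAB)
        seq = tokens.size(1)
        x = self.embed(tokens) + self.pos(torch.arange(seq, device=tokens.device))
        causal = torch.full((seq, seq), float("-inf"), device=tokens.device).triu(1)
        h = self.encoder(x, mask=causal)   # each pixel only attends to earlier pixels
        return self.head(h)                # logits over the next pixel token

model = PixelTransformer()
pixels = torch.randint(0, VOCAB, (2, SEQ_LEN))   # two fake "images"
logits = model(pixels)
# Ordinary next-token cross-entropy: predict pixel t+1 from pixels up to t.
loss = nn.functional.cross_entropy(
    logits[:, :-1].reshape(-1, VOCAB), pixels[:, 1:].reshape(-1))
```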
OpenAI announced the upgraded GPT-3 with a whopping 175 billion parameters.
The 48-hour digital event kicked off yesterday, and the company wasted no time making impactful announcements that included a new supercomputer, a family of large AI models, and a Responsible ML on Microsoft Azure initiative.
Just as biologists gain insights into organisms by putting model specimens under their microscopes, OpenAI Microscope was designed to help researchers analyze the features that form inside leading computer vision models.
Synced Global AI Weekly February 2nd
Every Friday Synced selects seven recent studies that present topical, innovative or otherwise important research we believe may be of interest to our readers.
As global AI development and deployment continue, demand for AI talent is growing faster than ever. A number of industry leaders and reputable institutions offer AI residency programs designed to help nurture promising AI talent.
In a blog post today, OpenAI announced the final staged release of its 1.5 billion parameter language model GPT-2, along with all associated code and model weights.
We’ve lost the brain race, but humans still have unmatched dexterity, right? Wrong. OpenAI’s humanlike five-fingered gripper Dactyl just single-handedly solved a Rubik’s cube.
A new paper from San Francisco-based OpenAI proposes training models in the children’s game of hide-and-seek and pitting them against each other.
To ramp up the robustness of neural networks, researchers from OpenAI have introduced a novel method that evaluates how well a neural network classifier performs against adversarial attacks not seen during its training.
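The general recipe can be sketched as follows (a hypothetical illustration, not OpenAI's released evaluation code): take a trained classifier and measure its accuracy under an attack family it never saw during training, here plain FGSM with an assumed epsilon of 8/255.

```python
# Minimal sketch: accuracy of a classifier under an attack unseen in training.
import torch
import torch.nn.functional as F

def fgsm_attack(model, x, y, eps):
    """One-step gradient-sign perturbation of the inputs."""
    x = x.clone().requires_grad_(True)
    loss = F.cross_entropy(model(x), y)
    grad, = torch.autograd.grad(loss, x)
    return (x + eps * grad.sign()).clamp(0.0, 1.0).detach()

@torch.no_grad()
def accuracy(model, x, y):
    return (model(x).argmax(dim=1) == y).float().mean().item()

def eval_unseen_attack(model, loader, eps=8 / 255):
    clean_acc, adv_acc = [], []
    model.eval()
    for x, y in loader:
        clean_acc.append(accuracy(model, x, y))
        x_adv = fgsm_attack(model, x, y, eps)   # attack family unseen during training
        adv_acc.append(accuracy(model, x_adv, y))
    return sum(clean_acc) / len(clean_acc), sum(adv_acc) / len(adv_acc)
```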
While it is exhilarating to see AI researchers pushing the performance of cutting-edge models to new heights, the costs of such processes are also rising at a dizzying rate.
San Francisco research company OpenAI has developed Sparse Transformer, a deep neural network that outperforms current state-of-the-art techniques for predicting long-sequence data in text, images and sound.
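The core trick can be sketched with a toy attention mask (an illustration in the spirit of the paper, not OpenAI's released kernels): each position attends to a local window of recent positions plus every stride-th earlier position, so each row keeps roughly O(sqrt(n)) keys instead of O(n).

```python
# Illustrative "strided" sparse attention mask; stride and length are arbitrary.
import torch

def strided_sparse_mask(seq_len: int, stride: int) -> torch.Tensor:
    """Boolean mask, True where attention is allowed."""
    i = torch.arange(seq_len).unsqueeze(1)      # query positions
    j = torch.arange(seq_len).unsqueeze(0)      # key positions
    causal = j <= i                             # never attend to the future
    local = (i - j) < stride                    # recent positions within the window
    strided = (j % stride) == (stride - 1)      # periodic "summary" positions
    return causal & (local | strided)

print(strided_sparse_mask(seq_len=16, stride=4).int())
```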
Synced Global AI Weekly April 21st
After eight months of development efforts, OpenAI Five exacted their revenge today against one of the world’s top teams in a highly anticipated best-of-three 5v5 Dota 2 showdown in San Francisco.
Synced Global AI Weekly March 31st
10 AI News Stories You Must Know From March W1–W2
Synced Global AI Weekly March 17th
In a move that has surprised many, OpenAI today announced the creation of a new for-profit company to balance its huge expenditures on compute and AI talent. Sam Altman, the former president of Y Combinator who stepped down last week, has been named CEO of the new “capped-profit” company, OpenAI LP.
Synced Global AI Weekly February 24th
GitHub developer Hugging Face has updated its repository with a PyTorch reimplementation of the small version of the GPT-2 language model that OpenAI open-sourced last week, along with pretrained models and fine-tuning examples.
Synced Global AI Weekly February 17th
The San Francisco-based AI non-profit, however, has raised eyebrows in the research community with its unusual decision not to release the language model’s code and training dataset. In a statement sent to Synced, OpenAI explained the choice was made to prevent malicious use: “it’s clear that the ability to generate synthetic text that is conditioned on specific subjects has the potential for significant abuse.”
Synced Global AI Weekly December 9th
Synced Global AI Weekly Nov 18th
Artificial general intelligence (AGI) is the long-range, human-intelligence-level target of contemporary AI researchers worldwide. It’s believed AGI has the potential to meet basic human needs globally, end poverty, cure diseases, extend life, and even mitigate climate change. In short, AGI is the tech that could not only save the world, but build a utopia.
Synced surveyed a number of 2019 AI residency programs that may be of interest to readers.
In conjunction with yesterday’s release of the open source AI software framework PyTorch 1.0, leading deep learning course developer Fast.ai has announced its first open source library for deep learning: fastai v1.
Nadja Rhodes is enamoured with artificial intelligence. A Seattle-based Microsoft software developer unpracticed in AI techniques such as deep learning, Rhodes had applied to a number of tech company sponsored AI residency initiatives, but to no avail. And so she was thrilled to be accepted by OpenAI Scholars.
A hearty round of applause arose from the crowd packing Rogers Arena in Vancouver on August 22 when a team of unassuming scientists wearing “OpenAI” T-shirts climbed up on stage.
Last August at the Dota 2 International tournament in Seattle, OpenAI introduced an AI bot that upset the world’s top 1v1 human player. The San Francisco-based AI research institute is now at the International 2018 in Vancouver, where their team of state-of-the-art bots is battling professional human teams in a highly anticipated best-of-three 5v5 Dota 2 showdown.
Compute scoreboard (training compute relative to AlexNet): AlexNet 1×; AlphaZero 300,000×.
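As a rough back-of-the-envelope check (an illustration, not OpenAI's exact dataset), a 300,000× increase over the roughly five years between AlexNet (2012) and AlphaZero (late 2017) corresponds to training compute doubling about every 3.4 months, the figure OpenAI reported in its compute analysis.

```python
# Back-of-the-envelope arithmetic only; the dates and the ~5.2-year gap are approximations.
import math

growth = 300_000                  # AlphaZero training compute relative to AlexNet
months = 5.2 * 12                 # approx. late 2012 (AlexNet) to late 2017 (AlphaZero)
doublings = math.log2(growth)     # about 18.2 doublings
print(f"doubling time ≈ {months / doublings:.1f} months")   # about 3.4 months
```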