- OpenAI’s Approach to Solve Math Word Problems - Nov 9, 2021.
OpenAI's latest research aims to solve math word problems. Let's dive a bit deeper into the ideas behind this new research.
GPT-3, Mathematics, NLP, OpenAI
- Exclusive: OpenAI summarizes KDnuggets - Oct 23, 2021.
OpenAI has recently done amazing work summarizing full-length books. We have asked OpenAI to summarize two recent KDnuggets posts, and the results have a very human-like quality. Only the last line betrays the inhuman intelligence at work.
About KDnuggets, OpenAI, Summarization
- Scaling human oversight of AI systems for difficult tasks – OpenAI approach - Oct 11, 2021.
The foundational idea of Artificial Intelligence is that it should demonstrate human-level intelligence, so unless a model can perform a task as a human would, it misses its intended purpose. Recent OpenAI research into full-length book summarization focuses on generating results that make sense to humans, achieving state-of-the-art results by leveraging scalable, AI-enhanced human-in-the-loop feedback.
AGI, GPT-3, NLP, OpenAI, Summarization, Text Summarization
- An Introduction to Reinforcement Learning with OpenAI Gym, RLlib, and Google Colab - Sep 14, 2021.
Get an Introduction to Reinforcement Learning by attempting to balance a virtual CartPole with OpenAI Gym, RLlib, and Google Colab.
Google Colab, OpenAI, Python, Reinforcement Learning
- Multilingual CLIP with Huggingface + PyTorch Lightning - Mar 26, 2021.
An overview of training OpenAI's CLIP on Google Colab.
CLIP, Google Colab, Hugging Face, Image Recognition, NLP, OpenAI, PyTorch, PyTorch Lightning
- GPT-2 vs GPT-3: The OpenAI Showdown - Feb 17, 2021.
Thanks to the diversity of the dataset used in training, these models can generate adequate text across a variety of domains. GPT-2 has 10x the parameters and was trained on 10x the data of its predecessor, GPT.
GPT-2, GPT-3, Natural Language Generation, NLP, OpenAI, Transformer
- OpenAI Releases Two Transformer Models that Magically Link Language and Computer Vision - Jan 11, 2021.
OpenAI has released two new transformer architectures that combine image and language tasks in a fun and almost magical way. Read more about them here.
Computer Vision, NLP, OpenAI, Transformer
- Compute Goes Brrr: Revisiting Sutton’s Bitter Lesson for AI - Nov 19, 2020.
"It's just about having more compute." Wait, is that really all there is to AI? As Richard Sutton's 'bitter lesson' sinks in for more AI researchers, a debate has stirred that considers a potentially more subtle relationship between advancements in AI based on ever-more-clever algorithms and massively scaled computational power.
AI, AlphaGo, Machine Learning, OpenAI, Richard Sutton, Scalability, Trends
- Can AI Learn Human Values? - Oct 27, 2020.
OpenAI believes that the path to safe AI requires social sciences.
AI, Bias, Ethics, OpenAI
- A Curious Theory About the Consciousness Debate in AI - Aug 31, 2020.
Dr. Michio Kaku has formulated a very interesting theory of consciousness that applies to AI systems.
Agents, AI, DeepMind, OpenAI
- Must-read NLP and Deep Learning articles for Data Scientists - Aug 21, 2020.
NLP and deep learning continue to advance, nearly on a daily basis. Check out these recent must-read guides, feature articles, and other resources to keep you on top of the latest advancements and ahead of the curve.
Deep Learning, Google, GPT-3, NLP, OpenAI, Privacy, Research, Self-Driving, TensorFlow, Trends
- Exploring GPT-3: A New Breakthrough in Language Generation - Aug 10, 2020.
GPT-3 is the largest natural language processing (NLP) transformer released to date, eclipsing the previous record, Microsoft Research’s Turing-NLG at 17B parameters, by about 10 times. This has resulted in an explosion of demos: some good, some bad, all interesting.
GPT-3, Natural Language Generation, NLP, OpenAI, Turing Test
- GPT-3, a giant step for Deep Learning and NLP? - Jun 9, 2020.
Recently, OpenAI announced a new successor to their language model, GPT-3, which is the largest model trained to date, with 175 billion parameters. Training a language model this large has its merits and limitations, so this article covers some of its most interesting and important aspects.
AI, Deep Learning, GPT-2, GPT-3, NLP, OpenAI
- The Double Descent Hypothesis: How Bigger Models and More Data Can Hurt Performance - Apr 20, 2020.
OpenAI research shows a phenomenon that challenges both traditional statistical learning theory and conventional wisdom among machine learning practitioners.
Deep Learning, Modeling, OpenAI
- OpenAI Open Sources Microscope and the Lucid Library to Visualize Neurons in Deep Neural Networks - Apr 17, 2020.
The new tools show the potential of data visualizations for understanding features in a neural network.
Neural Networks, Open Source, OpenAI, Visualization
- OpenAI is Adopting PyTorch… They Aren’t Alone - Jan 31, 2020.
OpenAI is moving to PyTorch for the bulk of their research work. This might be a high-profile adoption, but it is far from the only such example.
Adoption, AI, Deep Learning, OpenAI, PyTorch
- OpenAI Tried to Train AI Agents to Play Hide-And-Seek but Instead They Were Shocked by What They Learned - Oct 7, 2019.
OpenAI trained agents on a simple game of hide-and-seek, and the agents learned many other skills in the process.
AI, OpenAI, Reinforcement Learning
- Scaling a Massive State-of-the-art Deep Learning Model in Production - Jul 15, 2019.
A new NLP text writing app based on OpenAI's GPT-2 aims to write with you -- whenever you ask. Find out how the developers set up and deployed their model into production from an engineer working on the team.
Deep Learning, Deployment, NLP, OpenAI, Scalability, Transformer
- Examining the Transformer Architecture: The OpenAI GPT-2 Controversy - Jun 20, 2019.
GPT-2 is a generative model, created by OpenAI, trained on 40GB of Internet text to predict the next word. And OpenAI found this model to be so good that they did not release the fully trained version, due to concerns about malicious applications of the technology.
AI, Architecture, GPT-2, NLP, OpenAI, Transformer
- OpenAI’s GPT-2: the model, the hype, and the controversy - Mar 4, 2019.
OpenAI recently released a very large language model called GPT-2. Controversially, they decided not to release the data or the parameters of their biggest model, citing concerns about potential abuse. Read this researcher's take on the issue.
AI, Ethics, GPT-2, Hype, NLP, OpenAI
- Eat Melon: A Deep Q Reinforcement Learning Demo in your browser - Jan 20, 2017.
Check "Eat Melon demo", a fun way to gain familiarity with the Deep Q Learning algorithm, which you can do in your browser.
Atari, Deep Learning, OpenAI, Reinforcement Learning
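The Deep Q Learning algorithm behind demos like this rests on the Bellman update; the demo approximates Q with a neural network, but a minimal tabular sketch in plain Python shows the core update. (The one-dimensional grid world, reward values, and hyperparameters below are illustrative assumptions, not taken from the demo.)

```python
import random

random.seed(0)

# Illustrative 1-D grid world: states 0..4, reaching state 4 ends the episode with reward 1.
N_STATES, ACTIONS = 5, (-1, +1)   # move left / move right
ALPHA, GAMMA, EPSILON = 0.5, 0.9, 0.1

Q = {(s, a): 0.0 for s in range(N_STATES) for a in ACTIONS}

def step(state, action):
    """Environment dynamics: clamp to the grid; reward 1 on reaching the goal."""
    nxt = max(0, min(N_STATES - 1, state + action))
    return nxt, (1.0 if nxt == N_STATES - 1 else 0.0)

for _ in range(500):                          # training episodes
    s = 0
    while s != N_STATES - 1:
        # Epsilon-greedy action selection: explore occasionally, otherwise exploit.
        if random.random() < EPSILON:
            a = random.choice(ACTIONS)
        else:
            a = max(ACTIONS, key=lambda act: Q[(s, act)])
        s2, r = step(s, a)
        # Q-learning (Bellman) update: Q(s,a) += alpha * (r + gamma * max_a' Q(s',a') - Q(s,a))
        best_next = max(Q[(s2, a2)] for a2 in ACTIONS)
        Q[(s, a)] += ALPHA * (r + GAMMA * best_next - Q[(s, a)])
        s = s2

# After training, the greedy policy moves right (toward the goal) from every state.
policy = {s: max(ACTIONS, key=lambda act: Q[(s, act)]) for s in range(N_STATES - 1)}
print(policy)  # → {0: 1, 1: 1, 2: 1, 3: 1}
```

Deep Q Learning replaces the `Q` table with a neural network trained to minimize the squared difference between `Q(s, a)` and the bootstrapped target `r + gamma * max_a' Q(s', a')`, which is what lets it scale to pixel inputs like Atari or this browser game.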