- OpenAI’s Approach to Solve Math Word Problems - Nov 9, 2021.
OpenAI's latest research aims to solve math word problems. Let's dive a bit deeper into the ideas behind this new research.
GPT-3, Mathematics, NLP, OpenAI
- Scaling human oversight of AI systems for difficult tasks – OpenAI approach - Oct 11, 2021.
A foundational aim of Artificial Intelligence is to demonstrate human-level intelligence, so a model that cannot perform as a human would misses its intended purpose. Recent OpenAI research into full-length book summarization focuses on generating summaries that make sense to humans, achieving state-of-the-art results by leveraging scalable, AI-assisted human-in-the-loop feedback.
AGI, GPT-3, NLP, OpenAI, Summarization, Text Summarization
- Surpassing Trillion Parameters and GPT-3 with Switch Transformers – a path to AGI? - Oct 1, 2021.
Ever larger models churning on increasingly faster machines suggest a potential path toward smarter AI, as with the massive GPT-3 language model. However, newer, leaner approaches are being conceived and explored that may rival these super-models, potentially leading to more efficient implementations of advanced AI-driven systems.
AGI, Deep Learning, GPT-3, Transformer
- Jurassic-1 Language Models and AI21 Studio - Aug 23, 2021.
AI21 Labs’ new developer platform offers instant access to its 178B-parameter language model, helping you build sophisticated text-based AI applications at scale.
AI, GPT-3, NLP
- GPT-2 vs GPT-3: The OpenAI Showdown - Feb 17, 2021.
Thanks to the diversity of its training dataset, GPT-2 can generate adequate text across a variety of domains. It has 10x the parameters and was trained on 10x the data of its predecessor, GPT.
GPT-2, GPT-3, Natural Language Generation, NLP, OpenAI, Transformer
- Six Times Bigger than GPT-3: Inside Google’s TRILLION Parameter Switch Transformer Model - Jan 25, 2021.
Google’s Switch Transformer model could be the next breakthrough in this area of deep learning.
Google, GPT-3, NLP, Transformer
- Main 2020 Developments and Key 2021 Trends in AI, Data Science, Machine Learning Technology - Dec 9, 2020.
Our panel of leading experts reviews 2020 main developments and examines the key trends in AI, Data Science, Machine Learning, and Deep Learning Technology.
2021 Predictions, AI, AutoML, Bill Schmarzo, Carla Gentry, COVID-19, Doug Laney, GPT-3, Kirk D. Borne, Machine Learning, MLOps, Predictions, Ronald van Loon, Tom Davenport, Trends
- Must-read NLP and Deep Learning articles for Data Scientists - Aug 21, 2020.
NLP and deep learning continue to advance, nearly on a daily basis. Check out these recent must-read guides, feature articles, and other resources to keep you on top of the latest advancements and ahead of the curve.
Deep Learning, Google, GPT-3, NLP, OpenAI, Privacy, Research, Self-Driving, TensorFlow, Trends
- KDnuggets™ News 20:n31, Aug 12: Data Science Skills: Have vs Want: Vote in the New Poll; Netflix Polynote is a New Open Source Framework to Build Better Data Science Notebooks - Aug 12, 2020.
Vote in the new KDnuggets poll: which data science skills do you have, and which do you want? Netflix is not only for movies: its Polynote is a new open source framework for building better data science notebooks. Also: learn about containerizing PySpark using Kubernetes; read the findings from the Data Scientist Job Market 2020 analysis; and explore the latest on GPT-3.
Data Science Skills, GPT-3, Jobs, Kubernetes, Netflix, Poll, PySpark
- Exploring GPT-3: A New Breakthrough in Language Generation - Aug 10, 2020.
GPT-3 is the largest natural language processing (NLP) transformer released to date, eclipsing the previous record holder, Microsoft Research’s 17B-parameter Turing-NLG, by about 10 times. This has resulted in an explosion of demos: some good, some bad, all interesting.
GPT-3, Natural Language Generation, NLP, OpenAI, Turing Test
- GPT-3, a giant step for Deep Learning and NLP? - Jun 9, 2020.
Recently, OpenAI announced GPT-3, the successor to their earlier language model and, at 175 billion parameters, the largest model trained to date. Training a language model this large has its merits and limitations, so this article covers some of its most interesting and important aspects.
AI, Deep Learning, GPT-2, GPT-3, NLP, OpenAI