- High-Performance Deep Learning: How to train smaller, faster, and better models – Part 5 - Jul 16, 2021.
Training efficient deep learning models with any software tool is nothing without robust and performant compute infrastructure. Here, we review the current software and hardware ecosystems you might consider in your development when the highest possible performance is needed.
Deep Learning, Efficiency, Google, Hardware, Machine Learning, NVIDIA, PyTorch, Scalability, TensorFlow
- How to Use NVIDIA GPU Accelerated Libraries - Jul 1, 2021.
If you are wondering how to take advantage of NVIDIA GPU-accelerated libraries for your AI projects, this guide will answer your questions and get you started on the right path; a short code sketch follows below.
GPU, NVIDIA, Programming
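For readers who want a concrete picture before clicking through, the sketch below shows one common way to tap NVIDIA's accelerated libraries from Python via CuPy. It assumes the cupy package is installed against a matching CUDA toolkit, and it is only an illustration, not the guide's own example.

```python
# A minimal sketch of calling a GPU-accelerated library from Python,
# assuming `cupy` is installed against a matching CUDA toolkit (illustrative only).
import numpy as np
import cupy as cp  # NumPy-like API backed by CUDA libraries (cuBLAS, cuRAND, ...)

# Create data on the host, then move it into GPU memory.
a_host = np.random.rand(2000, 2000).astype(np.float32)
a_gpu = cp.asarray(a_host)

# The matrix multiply runs on the GPU (via cuBLAS under the hood).
b_gpu = a_gpu @ a_gpu

# Copy the result back to the host as a regular NumPy array.
b_host = cp.asnumpy(b_gpu)
print(b_host.shape)
```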
- Easily Deploy Deep Learning Models in Production - Aug 1, 2019.
Deploying trained neural networks in applications and services can pose challenges for infrastructure managers: multiple frameworks, underutilized infrastructure, and a lack of standard implementations can even cause AI projects to fail. This blog explores how to navigate these challenges; a brief illustrative sketch follows below.
Deep Learning, Deployment, GPU, Inference, NVIDIA
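The article itself centers on NVIDIA's inference tooling; as a hedged illustration of the "multiple frameworks" pain point, the sketch below exports a toy PyTorch model to the framework-neutral ONNX format and runs it with ONNX Runtime. The model, file name, and input shape are placeholders, not anything from the article.

```python
# A hedged sketch of exporting a trained model to a framework-neutral format,
# assuming PyTorch and onnxruntime are installed; the model and shapes are illustrative.
import torch
import torch.nn as nn

model = nn.Sequential(nn.Linear(16, 8), nn.ReLU(), nn.Linear(8, 2)).eval()
dummy_input = torch.randn(1, 16)

# Export to ONNX so any ONNX-capable serving stack can run the model,
# regardless of the framework it was trained in.
torch.onnx.export(model, dummy_input, "model.onnx",
                  input_names=["input"], output_names=["output"])

# Run inference with ONNX Runtime to verify the exported graph.
import onnxruntime as ort
session = ort.InferenceSession("model.onnx", providers=["CPUExecutionProvider"])
outputs = session.run(None, {"input": dummy_input.numpy()})
print(outputs[0].shape)
```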
- Here’s how you can accelerate your Data Science on GPU - Jul 30, 2019.
Data scientists need computing power. Whether you're processing a big dataset with Pandas or running a computation on a massive matrix with NumPy, you need a powerful machine to get the job done in a reasonable amount of time; a GPU-based sketch follows below.
Big Data, Data Science, DBSCAN, Deep Learning, GPU, NVIDIA, Python
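One way such acceleration looks in practice is the RAPIDS stack. The sketch below is a hedged example using cudf and cuml (availability and exact APIs depend on your RAPIDS version), with synthetic data and illustrative clustering parameters rather than anything taken from the article.

```python
# A hedged sketch of GPU-accelerated dataframes and clustering with RAPIDS
# (cudf / cuml); treat it as illustrative rather than definitive.
import numpy as np
import cudf                      # GPU DataFrame with a Pandas-like API
from cuml.cluster import DBSCAN  # GPU DBSCAN with a scikit-learn-like API

# Build a small synthetic dataset and place it on the GPU.
points = np.random.rand(10_000, 2).astype(np.float32)
gdf = cudf.DataFrame({"x": points[:, 0], "y": points[:, 1]})

# Cluster on the GPU; eps and min_samples are illustrative values.
db = DBSCAN(eps=0.05, min_samples=10)
labels = db.fit_predict(gdf)
print(labels[:10])
```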
- Generative Adversarial Networks – Key Milestones and State of the Art - Apr 24, 2019.
We provide an overview of Generative Adversarial Networks (GANs), discuss challenges in GAN learning, and examine two promising GANs: RadialGAN, designed for numerical data, and StyleGAN, which performs style transfer for images.
GANs, Generative Adversarial Network, NVIDIA
- Which Face is Real? - Apr 2, 2019.
Which Face Is Real? is a web application, built on Generative Adversarial Networks, in which users select which of two images they believe shows a real person and which was synthetically generated. The person in the synthetic photo does not exist.
Deep Learning, GANs, Generative Adversarial Network, Neural Networks, NVIDIA, Python
- Top 10 Recent AI videos on YouTube - May 10, 2017.
The most viewed videos on artificial intelligence since 2016 include great talks and lecture series from MIT and Caltech, as well as Google Tech Talks on AI.
AI, Google, Machine Learning, MIT, Neural Networks, NVIDIA, Robots, Youtube
- Deep Learning – Past, Present, and Future - May 2, 2017.
There is a lot of buzz around deep learning technology. First developed in the 1940s, deep learning was meant to simulate the neural networks found in brains, but in the last decade three key developments have unleashed its potential.
Andrew Ng, Big Data, Deep Learning, Geoff Hinton, Google, GPU, History, Neural Networks, NVIDIA
- Parallelism in Machine Learning: GPUs, CUDA, and Practical Applications - Nov 10, 2016.
The lack of parallel processing in many machine learning tasks holds back performance, yet adding it may very well be worth the trouble. Read on for an introductory overview of GPU-based parallelism, the CUDA framework, and some thoughts on practical implementation; a small kernel sketch follows below.
Algorithms, CUDA, GPU, NVIDIA, Parallelism
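To make the CUDA programming model concrete, here is a minimal kernel written from Python with Numba. It assumes numba and a CUDA-capable GPU and driver are available; the vector-add example is illustrative rather than taken from the article.

```python
# A minimal sketch of a CUDA kernel written from Python with Numba,
# assuming numba and a CUDA-capable GPU/driver are available.
import numpy as np
from numba import cuda

@cuda.jit
def vector_add(x, y, out):
    i = cuda.grid(1)          # global thread index
    if i < out.size:          # guard against out-of-range threads
        out[i] = x[i] + y[i]

n = 1_000_000
x = np.random.rand(n).astype(np.float32)
y = np.random.rand(n).astype(np.float32)

# Move inputs to the device and allocate the output there.
d_x, d_y = cuda.to_device(x), cuda.to_device(y)
d_out = cuda.device_array_like(x)

# Launch enough 256-thread blocks to cover all n elements.
threads = 256
blocks = (n + threads - 1) // threads
vector_add[blocks, threads](d_x, d_y, d_out)

print(d_out.copy_to_host()[:5])
```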
- Basics of GPU Computing for Data Scientists - Apr 7, 2016.
With the rise of neural networks in data science, the demand for computationally powerful machines has led to GPUs. Learn how you can get started with GPUs and the algorithms that can leverage them; a getting-started sketch follows below.
Algorithms, CUDA, Data Science, GPU, NVIDIA
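As a quick taste of getting started, the sketch below checks for a CUDA device with PyTorch and moves a matrix multiply onto it. PyTorch is an assumption here, chosen only for illustration.

```python
# A minimal "getting started" sketch: detect a CUDA device and run a
# computation on it, assuming PyTorch is installed with CUDA support.
import torch

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
print("Running on:", device)

# A matrix multiply is exactly the kind of data-parallel workload GPUs excel at.
a = torch.randn(4096, 4096, device=device)
b = torch.randn(4096, 4096, device=device)
c = a @ b
print(c.shape, c.device)
```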