- Zero to RAPIDS in Minutes with NVIDIA GPUs + Saturn Cloud - Sep 27, 2021.
Managing large-scale data science infrastructure presents significant challenges. With Saturn Cloud, managing GPU-based infrastructure is made easier, allowing practitioners and enterprises to focus on solving their business challenges.
GPU, NVIDIA, Python, Saturn Cloud
- Speeding up Neural Network Training With Multiple GPUs and Dask - Sep 14, 2021.
A common moment when training a neural network is realizing that the model isn’t training quickly enough on a CPU and that you need to switch to a GPU. It turns out multi-GPU model training across multiple machines is pretty easy with Dask. This blog post is about my first experiment using multiple GPUs with Dask and the results.
Dask, GPU, Neural Networks, Training
GPU-Powered Data Science (NOT Deep Learning) with RAPIDS - Aug 2, 2021.
How to utilize the power of your GPU for regular data science and machine learning even if you do not do a lot of deep learning work.
Data Science, GPU, Python
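As a rough illustration of the kind of Pandas-style, GPU-backed work RAPIDS enables, here is a minimal sketch using cuDF; it assumes a CUDA-capable GPU with the cudf package installed, and the column names are purely illustrative.

```python
# Minimal cuDF sketch: a Pandas-style groupby that executes on the GPU.
# Assumes a CUDA-capable GPU and the RAPIDS cudf package; columns are illustrative.
import cudf

df = cudf.DataFrame({
    "store": ["a", "b", "a", "b"],
    "sales": [10.0, 20.0, 30.0, 40.0],
})

# The groupby/aggregate runs on the GPU while mirroring the Pandas API.
summary = df.groupby("store")["sales"].mean()
print(summary)
```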
Not Only for Deep Learning: How GPUs Accelerate Data Science & Data Analytics - Jul 26, 2021.
Modern AI/ML systems’ success has been critically dependent on their ability to process massive amounts of raw data in a parallel fashion using task-optimized hardware. Can we leverage the power of GPU and distributed computing for regular data processing jobs too?
Data Analytics, Data Science, Deep Learning, GPU
- How to Use NVIDIA GPU Accelerated Libraries - Jul 1, 2021.
If you are wondering how you can take advantage of NVIDIA GPU accelerated libraries for your AI projects, this guide will help answer questions and get you started on the right path.
GPU, NVIDIA, Programming
- Super Charge Python with Pandas on GPUs Using Saturn Cloud - May 12, 2021.
Saturn Cloud is a tool that gives you 10 hours of free GPU computing and 3 hours of free Dask cluster computing each month. In this tutorial, you will learn how to use these free resources to process data with Pandas on a GPU. The experiments show that Pandas on a CPU is over 1,000,000% slower than Pandas on a Dask cluster of GPUs.
Cloud, GPU, Pandas, Python
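For context, a minimal sketch of Pandas-style processing on a Dask cluster of GPUs, along the lines the tutorial describes; it assumes the dask_cuda and dask_cudf packages and local GPUs, and the file path and column names are placeholders.

```python
# Sketch of GPU-accelerated, Pandas-style processing with Dask + RAPIDS.
# Assumes dask_cuda and dask_cudf are installed and GPUs are available;
# the file path and column names are placeholders.
from dask_cuda import LocalCUDACluster
from dask.distributed import Client
import dask_cudf

cluster = LocalCUDACluster()        # one worker per local GPU
client = Client(cluster)

ddf = dask_cudf.read_csv("data/*.csv")                  # lazily partitioned GPU DataFrames
result = ddf.groupby("key")["value"].mean().compute()   # executes across the GPU workers
print(result)
```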
Good-bye Big Data. Hello, Massive Data! - Oct 22, 2020.
Join the Massive Data Revolution with SQream. Shorten query times from days to hours or minutes, and speed up data preparation by analyzing the raw data directly.
Big Data, GPU, SQream
- HOSTKEY GPU Grant Program - Aug 10, 2020.
The HOSTKEY GPU Grant Program is open to Data Science specialists and professionals whose research or projects center on innovative uses of GPU processing and will yield practical results in the field, with the objective of supporting basic scientific research and prospective startups.
Data Science, GPU, Research
- PyTorch Multi-GPU Metrics Library and More in New PyTorch Lightning Release - Jul 2, 2020.
PyTorch Lightning, a very lightweight wrapper for PyTorch, recently released version 0.8.1, a major milestone. With incredible user adoption and growth, they are continuing to build tools to easily do AI research.
GPU, Metrics, Python, PyTorch, PyTorch Lightning
A Complete guide to Google Colab for Deep Learning - Jun 16, 2020.
Google Colab is a widely popular cloud service for machine learning that features free access to GPU and TPU computing. Follow this detailed guide to help you get up and running fast to develop your next deep learning algorithms with Colab.
Deep Learning, GitHub, Google Colab, GPU, Jupyter
- Deep Learning Breakthrough: a sub-linear deep learning algorithm that does not need a GPU? - Mar 26, 2020.
Deep Learning sits at the forefront of many important advances underway in machine learning. With backpropagation being a primary training method, its computational inefficiencies require sophisticated hardware, such as GPUs. Learn about a recent algorithmic breakthrough that improves the backpropagation calculations on a CPU enough to outperform large neural network training on a GPU.
Algorithms, Deep Learning, GPU, Machine Learning
- GPU Accelerated Data Analytics & Machine Learning - Aug 2, 2019.
The future is here! Speed up your Machine Learning workflow using the Python RAPIDS libraries.
Analytics, GPU, Machine Learning, Python
- Easily Deploy Deep Learning Models in Production - Aug 1, 2019.
Getting trained neural networks to be deployed in applications and services can pose challenges for infrastructure managers. Challenges like multiple frameworks, underutilized infrastructure and lack of standard implementations can even cause AI projects to fail. This blog explores how to navigate these challenges.
Deep Learning, Deployment, GPU, Inference, NVIDIA
- Here’s how you can accelerate your Data Science on GPU - Jul 30, 2019.
Data Scientists need computing power. Whether you’re processing a big dataset with Pandas or running some computation on a massive matrix with NumPy, you’ll need a powerful machine to get the job done in a reasonable amount of time.
Big Data, Data Science, DBSCAN, Deep Learning, GPU, NVIDIA, Python
- Nvidia’s New Data Science Workstation — a Review and Benchmark - Jul 3, 2019.
Nvidia has recently released their Data Science Workstation, a PC that puts together all the Data Science hardware and software into one nice package. The workstation is a total powerhouse machine, packed with all the computing power — and software — that’s great for plowing through data.
Advice, Big Data, Deep Learning, GPU, NVIDIA
- Examining the Transformer Architecture – Part 2: A Brief Description of How Transformers Work - Jul 2, 2019.
As the Transformer may become the new NLP standard, this review explores its architecture along with a comparison to existing RNN-based approaches.
BERT, Deep Learning, Exxact, GPU, NLP, Recurrent Neural Networks, Transfer Learning, Transformer
- Mastering Fast Gradient Boosting on Google Colaboratory with free GPU - Mar 19, 2019.
CatBoost is a fast implementation of gradient boosted decision trees (GBDT) with GPU support out-of-the-box. Google Colaboratory is a very useful tool with free GPU support.
CatBoost, Google Colab, GPU, Gradient Boosting, Machine Learning, Python, Yandex
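A minimal sketch of what GPU training with CatBoost looks like, e.g. on a Colab runtime with a GPU attached; the toy dataset and hyperparameters are illustrative, not from the article.

```python
# Minimal sketch of GPU-accelerated gradient boosting with CatBoost.
# Assumes a CUDA GPU (e.g. a Colab GPU runtime); data and settings are illustrative.
from catboost import CatBoostClassifier
from sklearn.datasets import make_classification

X, y = make_classification(n_samples=10_000, n_features=20, random_state=0)

model = CatBoostClassifier(
    iterations=200,
    task_type="GPU",   # train on the GPU instead of the CPU
    devices="0",       # which GPU(s) to use
    verbose=50,
)
model.fit(X, y)
```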
- Top KDnuggets tweets, Jan 30 – Feb 05: state-of-the-art in #AI, #MachineLearning - Feb 6, 2019.
Also: Brilliant tour-de-force! Reinforcement Learning to solve Rubik's Cube; Dask, Pandas, and GPUs: first steps; Neural network AI is simple. So Stop pretending you are a genius.
Dask, GPU, Pandas, Python, Reinforcement Learning, Top tweets
- XGBoost on GPUs: Unlocking Machine Learning Performance and Productivity - Dec 7, 2018.
On Dec 18, 11:00 AM PT, join NVIDIA for a technical deep dive into GPU-accelerated machine learning, exploring the benefits of XGBoost on GPUs and much more.
GPU, Machine Learning, NVIDIA, XGBoost
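For readers who want a feel for it before the webinar, here is a minimal sketch of GPU training in XGBoost; it assumes an NVIDIA GPU and a CUDA-enabled XGBoost build, and the toy data is illustrative.

```python
# Minimal sketch of GPU-accelerated training with XGBoost.
# Assumes an NVIDIA GPU and a CUDA-enabled XGBoost build; data is a toy example.
import xgboost as xgb
from sklearn.datasets import make_classification

X, y = make_classification(n_samples=50_000, n_features=30, random_state=0)
dtrain = xgb.DMatrix(X, label=y)

params = {
    "objective": "binary:logistic",
    "tree_method": "gpu_hist",   # GPU histogram algorithm (device="cuda" in newer releases)
    "max_depth": 6,
}
booster = xgb.train(params, dtrain, num_boost_round=100)
```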
- Best Deals in Deep Learning Cloud Providers: From CPU to GPU to TPU - Nov 15, 2018.
A detailed comparison of the best places to train your deep learning model for the lowest cost and hassle, including AWS, Google, Paperspace, vast.ai, and more.
Cloud Computing, Deep Learning, GPU, TPU
- Key Takeaways from AI Conference SF, Day 2: AI and Security, Adversarial Examples, Innovation - Oct 30, 2018.
Highlights and key takeaways from selected keynote sessions on day 2 of AI Conference San Francisco 2018.
Adversarial, AI, Architecture, GPU, O'Reilly, Privacy, San Francisco, TPU, Training Data
- Key Takeaways from AI Conference SF, Day 1: Domain Specific Architectures, Emerging China, AI Risks - Oct 29, 2018.
Highlights and key takeaways include Domain Specific Architectures – the next big thing, Emerging China – evolving from copying ideas to true innovation, and Addressing Risks in AI – Security, Privacy, and Ethics.
AGI, AI, Architecture, Big Data, China, GPU, O'Reilly, OpenAI, Risks, San Francisco
- Kinetica: Software Engineer (Python) [Arlington, VA] - Aug 21, 2018.
Work closely with the Product Owner to build out the product in Python, integrating the other components (TensorFlow, Kubernetes, and our GPU-powered DB) via Python bindings to deliver an overall product (a REST API).
Arlington, Database, GPU, Kinetica, Python, Software Engineer, VA
- Kinetica: Sr. Software Engineer (Machine Learning) [Arlington, VA] - Aug 21, 2018.
Join an accomplished team to help build out a new scalable, distributed machine learning and data science platform with tight integrations and pipelines to a distributed, sharded GPU-powered database.
Database, Distributed Systems, GPU, Kinetica, Machine Learning, Software Engineer
- A Crash Course in MXNet Tensor Basics & Simple Automatic Differentiation - Aug 16, 2018.
This is an overview of some basic functionality of the MXNet ndarray package for creating tensor-like objects, and using the autograd package for performing automatic differentiation.
GPU, MXNet, Python, Tensor
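A minimal sketch of the two pieces the overview covers, ndarray tensors and autograd-based automatic differentiation; the values are illustrative.

```python
# Minimal sketch of MXNet ndarray tensors and automatic differentiation with autograd.
from mxnet import nd, autograd

x = nd.array([[1.0, 2.0], [3.0, 4.0]])
x.attach_grad()                 # allocate space for the gradient of x

with autograd.record():         # record the computation graph
    y = (x * x).sum()

y.backward()                    # backpropagate
print(x.grad)                   # dy/dx = 2x
```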
PyTorch Tensor Basics - May 11, 2018.
This is an introduction to PyTorch's Tensor class, which is reasonably analogous to Numpy's ndarray, and which forms the basis for building neural networks in PyTorch.
GPU, Python, PyTorch, Tensor
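A minimal sketch of the Tensor/ndarray analogy the introduction draws; the final move to the GPU assumes one is available.

```python
# Minimal sketch of PyTorch Tensor basics and their NumPy analogy.
import numpy as np
import torch

a = np.arange(6, dtype=np.float32).reshape(2, 3)
t = torch.from_numpy(a)          # shares memory with the NumPy ndarray

u = torch.ones(2, 3)
print(t + u)                     # elementwise ops, as with ndarrays
print(t.mm(u.t()))               # matrix multiply

if torch.cuda.is_available():
    t_gpu = t.to("cuda")         # move the tensor to the GPU
    print(t_gpu.device)
```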
- Comparing Deep Learning Frameworks: A Rosetta Stone Approach - Mar 26, 2018.
A Rosetta Stone of deep-learning frameworks has been created to allow data-scientists to easily leverage their expertise from one framework to another.
Caffe, CNTK, Deep Learning, GPU, Keras, Microsoft, MXNet, PyTorch, TensorFlow
- Score a Nvidia Titan V GPU at AnacondaCON 2018 - Mar 21, 2018.
At AnacondaCON 2018 in Austin, Apr 8-11, you'll learn how data scientists are using GPUs for machine learning across a variety of applications and industries. The best part? One lucky attendee will receive a FREE NVIDIA TITAN V GPU!
Anaconda, Austin, Data Science, GPU, Machine Learning, NVIDIA, TX
- For GPU Databases of today, the big challenge is doing JOINS - Mar 2, 2018.
While some GPU database problems have been solved, one challenge remains that only one vendor has tackled properly: fast SQL JOINs on the GPU.
Brytlyt, Database, GPU, Postgres
- Fast.ai Lesson 1 on Google Colab (Free GPU) - Feb 8, 2018.
In this post, I will demonstrate how to use Google Colab for fastai. You can use a GPU as a backend for free for 12 hours at a time. GPU compute for free? Are you kidding me?
Deep Learning, fast.ai, Google, Google Colab, GPU, Jupyter
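Before starting the lesson, it helps to confirm the Colab runtime actually has a GPU attached (Runtime -> Change runtime type -> GPU). A minimal, not fastai-specific check:

```python
# Quick sanity check that the Colab runtime has a GPU attached.
import torch

if torch.cuda.is_available():
    print("GPU:", torch.cuda.get_device_name(0))
else:
    print("No GPU found - select a GPU runtime in Colab.")
```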
- Top KDnuggets tweets, Jan 24-30: Top 10 Algorithms for Machine Learning Newbies; Want to Become a Data Scientist? Try Feynman Technique - Jan 31, 2018.
Also: Chronological List of AI Books To Read - from Goedel, Escher, Bach ... ; Aspiring Data Scientists! Start to learn Statistics with these 6 books.
Algorithms, Cloud Computing, Google, GPU, Top tweets
- Supercharging Visualization with Apache Arrow - Jan 5, 2018.
Interactive visualization of large datasets on the web has traditionally been impractical. Apache Arrow provides a new way to exchange and visualize data at unprecedented speed and scale.
Apache Arrow, Big Data, Data Analytics, Data Visualization, Dremio, GPU, Graphistry, Open Source
- NVIDIA DGX Systems – Deep Learning Software Whitepaper - Nov 20, 2017.
Download this whitepaper from NVIDIA DGX Systems to gain insight into the engineering expertise and innovation found in the pre-optimized deep learning frameworks available only on NVIDIA DGX Systems, and learn how to dramatically reduce your engineering costs using today’s most popular frameworks.
Deep Learning, ebook, Free ebook, GPU, Neural Networks, NVIDIA
- 5 overriding factors for the successful implementation of AI - Oct 6, 2017.
Today AI is everywhere, from virtual assistants scheduling meetings, to facial recognition software and increasingly autonomous cars. We review five main factors for successful AI implementation.
AI, Algorithms, GDPR, GPU, Humans vs Machines, Implementation, Open Data
- GPU-accelerated, In-database Analytics for Operationalizing AI - Oct 2, 2017.
This blog explores how the massive parallel processing power of the GPU is able to unify the entire AI pipeline on a single platform, and how this is both necessary and sufficient for overcoming the challenges to operationalizing AI.
AI, Analytics, GPU, In-Database, Kinetica, TensorFlow
- Tensorflow Tutorial, Part 2 – Getting Started - Sep 28, 2017.
This tutorial will lay a solid foundation for your understanding of TensorFlow, the leading Deep Learning platform. The second part shows how to get started, install it, and build a small test case.
Deep Learning, GPU, Python, TensorFlow
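A minimal test case in the TensorFlow 1.x style this 2017 tutorial uses: build a tiny graph and verify it runs (on the GPU if one is visible to TensorFlow).

```python
# Minimal TensorFlow 1.x-style test case: build and run a tiny graph.
import tensorflow as tf

a = tf.constant([[1.0, 2.0], [3.0, 4.0]])
b = tf.constant([[1.0, 1.0], [0.0, 1.0]])
c = tf.matmul(a, b)

with tf.Session() as sess:   # in TF 2.x, eager execution makes the session unnecessary
    print(sess.run(c))
```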
- KDnuggets™ News 17:n32, Aug 23: The Rise of GPU Databases; Instagramming with Python for Data Analysis - Aug 23, 2017.
Also: Deep Learning and Neural Networks Primer; A New Beginning to Deep Learning; The most important step in Machine Learning process.
Databases, GPU, Instagram, Python
The Rise of GPU Databases - Aug 17, 2017.
The recent but noticeable shift from CPUs to GPUs is mainly due to the unique benefits they bring to sectors like AdTech, finance, telco, retail, and security/IT. We examine where GPU databases shine.
Big Data, Database, GPU, Predictive Analytics, SQL, SQream
Deep Learning – Past, Present, and Future - May 2, 2017.
There is a lot of buzz around deep learning technology. First developed in the 1940s, deep learning was meant to simulate neural networks found in brains, but in the last decade 3 key developments have unleashed its potential.
Andrew Ng, Big Data, Deep Learning, Geoff Hinton, Google, GPU, History, Neural Networks, NVIDIA
- Top KDnuggets tweets, Mar 29 – Apr 04: Free Must-Read Books for #MachineLearning; #Apache Slug, new #BigData project - Apr 5, 2017.
Also: Self-driving talent is fleeing Google and Uber to catch the autonomous-driving; Using Docker, CoreOS For #GPU Based #DeepLearning; A Short Guide to Navigating the Jupyter Ecosystem.
Docker, Free ebook, GPU, Self-Driving Car, Top tweets
- Data Science Deployments With Docker - Dec 1, 2016.
With the recent release of NVIDIA’s nvidia-docker tool, accessing GPUs from within Docker is a breeze. In this tutorial we’ll walk you through setting up nvidia-docker so you too can deploy machine learning models with ease.
Data Science, Docker, GPU, indico, NVIDIA
- Parallelism in Machine Learning: GPUs, CUDA, and Practical Applications - Nov 10, 2016.
The lack of parallel processing in machine learning tasks limits performance, yet parallelizing them may very well be worth the trouble. Read on for an introductory overview of GPU-based parallelism, the CUDA framework, and some thoughts on practical implementation.
Algorithms, CUDA, GPU, NVIDIA, Parallelism
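As a taste of GPU parallelism from Python, here is a minimal sketch using Numba's CUDA JIT rather than the raw CUDA C the overview discusses; it assumes an NVIDIA GPU with CUDA and the numba package installed.

```python
# Minimal sketch of GPU parallelism via Numba's CUDA JIT:
# each GPU thread handles one element of a vector addition.
import numpy as np
from numba import cuda

@cuda.jit
def vec_add(x, y, out):
    i = cuda.grid(1)             # global thread index
    if i < x.size:
        out[i] = x[i] + y[i]

n = 1_000_000
x = np.ones(n, dtype=np.float32)
y = np.ones(n, dtype=np.float32)
out = np.zeros_like(x)

threads = 256
blocks = (n + threads - 1) // threads
vec_add[blocks, threads](x, y, out)   # arrays are copied to/from the device automatically
print(out[:5])
```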
- NVIDIA: Deep Learning Library Software Development Engineer - Oct 20, 2016.
Researchers and companies are using GPUs to power a revolution in deep learning, enabling breakthroughs in problems from image classification to speech recognition to natural language processing. Join the team building software that will be used by the entire world.
CA, Deep Learning, GPU, NVIDIA, Santa Clara, Software Engineer
- Neural Designer: Predictive Analytics Software - Sep 26, 2016.
Neural Designer's advanced neural network algorithms, combined with a simple user interface and fast performance, make it a great tool for data scientists. Download the free 15-day trial version.
Classification, CUDA, Forecasting, GPU, Neural Networks, Predictive Analytics
- Up to Speed on Deep Learning: July Update - Aug 29, 2016.
Check out this thorough roundup of deep learning stories that made news in July. See if there are any items of note you missed.
Cats, Deep Learning, DeepMind, Google, GPU, Healthcare, Machine Learning
- How to Build Your Own Deep Learning Box - Jun 2, 2016.
Want to build an affordable deep learning box and get all the required software installed? Read on for a proper overview.
CUDA, Deep Learning, GPU
- KDnuggets™ News 16:n13, Apr 13: Signs of a Bad Data Scientist; Deep Learning from 30,000 Feet; Analytics Survey - Apr 13, 2016.
10 Signs Of A Bad Data Scientist; Deep Learning from 30,000 feet; IDC/KDnuggets Advanced Analytics Survey - please participate; A Pocket Guide to Data Science; Basics of GPU Computing for Data Scientists
Data Science, Deep Learning, GPU, IDC, Survey
- Basics of GPU Computing for Data Scientists - Apr 7, 2016.
With the rise of neural networks in data science, the demand for computationally powerful machines has led data scientists to GPUs. Learn how you can get started with GPUs and the algorithms that can leverage them.
Algorithms, CUDA, Data Science, GPU, NVIDIA
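As one simple way to get started, here is a minimal sketch of NumPy-style array math on the GPU with CuPy; it assumes a CUDA GPU and the cupy package, and is only one of several entry points the article surveys.

```python
# Minimal sketch of NumPy-style array math on the GPU with CuPy.
import cupy as cp

x = cp.random.rand(4096, 4096).astype(cp.float32)
y = cp.random.rand(4096, 4096).astype(cp.float32)

z = x @ y                 # matrix multiply runs on the GPU
print(float(z.sum()))     # bring the scalar result back to the host
```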
- NVIDIA: Senior Data Mining Analyst - Aug 16, 2015.
Fill a key role in our Business Planning & Analytics Team, the analytic hub of NVIDIA's product marketing organization.
CA, Data Mining Analyst, GPU, NVIDIA, Santa Clara
- Popular Deep Learning Tools – a review - Jun 18, 2015.
Deep Learning is the hottest trend now in AI and Machine Learning. We review the popular software for Deep Learning, including Caffe, Cuda-convnet, Deeplearning4j, Pylearn2, Theano, and Torch.
Convolutional Neural Networks, CUDA, Deep Learning, GPU, Pylearn2, Python, Ran Bi, Theano, Torch
- Top /r/MachineLearning Posts, Mar 8-14: Word vectors, Hardware for Deep Learning, and Neural Graphics Engines - Mar 19, 2015.
Word vectors in NLP, Machine Learning's place in programming, hardware for deep learning, Machine Learning interviews, and neural graphics engines are all topics covered this week on /r/MachineLearning.
Deep Learning, GPU, Graphics, Interview Questions, NLP, Reddit
- Top /r/MachineLearning Posts, Mar 1-7: Stanford Deep Learning for NLP, Machine Learning with Scikit-learn - Mar 9, 2015.
This week on /r/MachineLearning, we have a new NLP-focused deep learning course from Stanford, an introduction to scikit-learn, visualization of music collections, an implementation of DeepMind, and NLP using deep learning and Torch.
Deep Learning, DeepMind, Facebook, GPU, Python, Reddit, scikit-learn, Torch
- Top /r/MachineLearning Posts, Feb 22-28: Jurgen Schmidhuber AMA and Machine Learning Done Wrong - Mar 4, 2015.
The Jürgen Schmidhuber AMA begins taking questions, machine learning done wrong, GPUs for deep learning, Google opens its native MapReduce capabilities, and Google publishes its DeepMind paper this week on /r/MachineLearning.
Deep Learning, DeepMind, GPU, Jurgen Schmidhuber, Machine Learning, Reddit
- Facebook Open Sources deep-learning modules for Torch - Feb 9, 2015.
We review Facebook's recently released Torch modules for Deep Learning, which help researchers train large-scale convolutional neural networks for image recognition, natural language processing, and other AI applications.
Artificial Intelligence, Deep Learning, Facebook, GPU, Neural Networks, NYU, Ran Bi, Torch, Yann LeCun
- Associations and Text Mining of World Events - Sep 30, 2014.
Applying frequent itemset analysis to text may seem daunting, but parallel hardware and two insights open the door to theme extraction.
Association Rules, Chris Painter, GPU, Neo4j, Text Mining
- CuDNN – A new library for Deep Learning - Sep 19, 2014.
As it becomes more and more popular, deep learning has proved useful in artificial intelligence. Last week, NVIDIA's new library for deep neural networks, cuDNN, attracted much attention.
Convolutional Neural Networks, Deep Learning, GPU, NVIDIA, Yann LeCun