- What Are NVIDIA NGC Containers & How to Get Started Using Them - Nov 15, 2021.
NVIDIA, a pioneer of GPU technology and the deep learning revolution, offers an excellent catalog of specialized containers it calls NGC Collections. In this article, we explore their basic usage and some variations; a minimal launch sketch follows this entry.
Containers, Data Engineering, Deep Learning, NVIDIA
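To make the basic workflow concrete, here is a minimal Python sketch that pulls and runs an NGC image through the Docker CLI. The image tag is illustrative (check ngc.nvidia.com for current tags), and it assumes Docker plus the NVIDIA Container Toolkit are already installed:

```python
import subprocess

# Pull a PyTorch container from the NGC registry (nvcr.io).
# The "21.10-py3" tag is an example; see ngc.nvidia.com for current tags.
image = "nvcr.io/nvidia/pytorch:21.10-py3"
subprocess.run(["docker", "pull", image], check=True)

# Launch the container with all GPUs visible (requires the NVIDIA
# Container Toolkit) and verify GPU access with nvidia-smi.
subprocess.run(
    ["docker", "run", "--rm", "--gpus", "all", image, "nvidia-smi"],
    check=True,
)
```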
- Zero to RAPIDS in Minutes with NVIDIA GPUs + Saturn Cloud - Sep 27, 2021.
Managing large-scale data science infrastructure presents significant challenges. Saturn Cloud makes managing GPU-based infrastructure easier, allowing practitioners and enterprises to focus on solving their business challenges; a short RAPIDS sketch follows this entry.
GPU, NVIDIA, Python, Saturn Cloud
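To give a flavor of RAPIDS once the infrastructure is running, here is a minimal cuDF sketch; the toy DataFrame stands in for real data you would load with cudf.read_csv or cudf.read_parquet:

```python
import cudf  # RAPIDS GPU DataFrame library

# A small GPU-resident DataFrame; real workloads would load data
# with cudf.read_csv / cudf.read_parquet instead.
df = cudf.DataFrame({
    "group": ["a", "b", "a", "b"],
    "value": [1.0, 2.0, 3.0, 4.0],
})

# Pandas-like operations execute on the GPU.
means = df.groupby("group")["value"].mean()
print(means)
```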
- High-Performance Deep Learning: How to train smaller, faster, and better models – Part 5 - Jul 16, 2021.
Training efficient deep learning models with any software tool is nothing without robust, performant compute infrastructure. Here, we review the current software and hardware ecosystems you might consider when the highest possible performance is needed.
Deep Learning, Efficiency, Google, Hardware, Machine Learning, NVIDIA, PyTorch, Scalability, TensorFlow
- How to Use NVIDIA GPU Accelerated Libraries - Jul 1, 2021.
If you are wondering how you can take advantage of NVIDIA GPU-accelerated libraries for your AI projects, this guide will help answer questions and get you started on the right path; see the CuPy sketch after this entry for a taste.
GPU, NVIDIA, Programming
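As one example of these libraries, here is a minimal sketch using CuPy, NVIDIA's NumPy-compatible GPU array library; the matrix size is arbitrary:

```python
import cupy as cp  # GPU-accelerated, NumPy-compatible arrays

# Allocate a matrix on the GPU and multiply it on the device.
x_gpu = cp.random.rand(2048, 2048, dtype=cp.float32)
y_gpu = x_gpu @ x_gpu  # dispatches to cuBLAS under the hood

# Copy the result back to host memory only when needed.
y_cpu = cp.asnumpy(y_gpu)
print(y_cpu.shape, y_cpu.dtype)
```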
- Easily Deploy Deep Learning Models in Production - Aug 1, 2019.
Getting trained neural networks to be deployed in applications and services can pose challenges for infrastructure managers. Challenges like multiple frameworks, underutilized infrastructure and lack of standard implementations can even cause AI projects to fail. This blog explores how to navigate these challenges.
Deep Learning, Deployment, GPU, Inference, NVIDIA
- Here’s how you can accelerate your Data Science on GPU - Jul 30, 2019.
Data scientists need computing power. Whether you’re processing a big dataset with Pandas or running a computation on a massive matrix with NumPy, you’ll need a powerful machine to get the job done in a reasonable amount of time; the cuML sketch after this entry shows one way to do that on a GPU.
Big Data, Data Science, DBSCAN, Deep Learning, GPU, NVIDIA, Python
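As a concrete example of GPU-accelerated data science, here is a minimal sketch using cuML's DBSCAN, which mirrors scikit-learn's API; the toy points are illustrative:

```python
import cudf
from cuml.cluster import DBSCAN  # GPU-accelerated, scikit-learn-like API

# Toy 2-D points; real workloads would pass a large cuDF/NumPy dataset.
X = cudf.DataFrame({
    "x": [1.0, 1.1, 5.0, 5.1],
    "y": [1.0, 1.2, 5.0, 5.2],
})

# Same hyperparameters as sklearn's DBSCAN, but fit on the GPU.
db = DBSCAN(eps=0.5, min_samples=2)
labels = db.fit_predict(X)
print(labels)
```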
- KDnuggets™ News 19:n25, Jul 10: 5 Probability Distributions for Data Scientists; What the Machine Learning Engineer Job is Really Like - Jul 10, 2019.
This edition of the KDnuggets newsletter is double-sized after taking the holiday week off. Learn about probability distributions every data scientist should know, what the machine learning engineering job is like, making the most money with the least amount of risk, the difference between NLP and NLU, get a take on Nvidia's new data science workstation, and much, much more.
Data Science, Data Scientist, Distribution, Machine Learning, Machine Learning Engineer, NLP, NVIDIA, Probability, Risk Modeling
- Nvidia’s New Data Science Workstation — a Review and Benchmark - Jul 3, 2019.
Nvidia has recently released its Data Science Workstation, a PC that puts together all the data science hardware and software into one nice package. The workstation is a total powerhouse, packed with all the computing power and software needed to plow through data.
Advice, Big Data, Deep Learning, GPU, NVIDIA
- Generative Adversarial Networks – Key Milestones and State of the Art - Apr 24, 2019.
We provide an overview of Generative Adversarial Networks (GANs), discuss challenges in GAN learning, and examine two promising GANs: the RadialGAN, designed for numerical (tabular) data, and the StyleGAN, which does style transfer for images.
GANs, Generative Adversarial Network, NVIDIA
- Which Face is Real? - Apr 2, 2019.
Which Face Is Real? is a web application, built on Generative Adversarial Networks, in which users guess which of two images shows a real person and which was synthetically generated. The person in the synthetic photo does not exist.
Deep Learning, GANs, Generative Adversarial Network, Neural Networks, NVIDIA, Python
- XGBoost on GPUs: Unlocking Machine Learning Performance and Productivity - Dec 7, 2018.
On Dec 18, 11:00 AM PT, join NVIDIA for a technical deep dive into GPU-accelerated machine learning, exploring the benefits of XGBoost on GPUs and much more; a minimal training sketch follows this entry.
GPU, Machine Learning, NVIDIA, XGBoost
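For a sense of how little code is involved, here is a minimal sketch of GPU-accelerated XGBoost training on synthetic data. The 'gpu_hist' tree method is the parameter name used by XGBoost releases of this era (newer versions use tree_method='hist' with device='cuda'):

```python
import numpy as np
import xgboost as xgb

# Synthetic regression data; any NumPy/pandas dataset works here.
rng = np.random.default_rng(0)
X = rng.random((10_000, 20)).astype(np.float32)
y = X @ rng.random(20).astype(np.float32)

dtrain = xgb.DMatrix(X, label=y)

# 'gpu_hist' selects the CUDA histogram tree builder.
params = {
    "tree_method": "gpu_hist",
    "max_depth": 6,
    "objective": "reg:squarederror",
}
booster = xgb.train(params, dtrain, num_boost_round=100)
```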
- Deep Learning: The Impact of NVIDIA DGX Station - Sep 25, 2018.
Read this IDC report & see how a deep learning workstation may solve IT problems of many researchers, developers, and creative professionals.
Deep Learning, IDC, NVIDIA
- See NVIDIA Deep Learning In Action [Webinar Series] - Sep 13, 2018.
Hear how three companies benefitted from the performance, simplicity and convenience of NVIDIA DGX Station to supercharge their deep learning development, infusing their products and services with the power of AI.
AI, Deep Learning, Deployment, NVIDIA
- Nvidia: AI Training for Self-Driving Vehicles [On-demand Webinar] - Aug 27, 2018.
We discuss the key considerations in selecting the optimal AI infrastructure required to train deep neural networks for safe self-driving systems, including data requirements and computing performance needed, and how to use NVIDIA DGX-1 for training autonomous vehicles.
AI, Deep Learning, NVIDIA, Self-Driving Car
- Deep Learning and Challenges of Scale Webinar - Jul 9, 2018.
Join Nvidia for an on-demand webinar to learn how to tackle the challenges of scaling and building complex deep learning systems.
Algorithms, Deep Learning, NVIDIA, Scalability
- Score a Nvidia Titan V GPU at AnacondaCON 2018 - Mar 21, 2018.
At AnacondaCON 2018 in Austin, Apr 8-11, you'll learn how data scientists are using GPUs for machine learning across a variety of applications and industries. The best part? One lucky attendee will receive a FREE NVIDIA TITAN V GPU!
Anaconda, Austin, Data Science, GPU, Machine Learning, NVIDIA, TX
- NVIDIA: AI Developer Technology Engineer - Jan 4, 2018.
NVIDIA is looking for a passionate, world-class computer scientist to work in its Compute Developer Technology (Devtech) team as an AI Developer Technology Engineer.
AI, CA, Developer, NVIDIA, Santa Clara
- NVIDIA: Sr Deep Learning Software Engineer - Jan 4, 2018.
NVIDIA is seeking a Senior Deep Learning Software Engineer to join its Autonomous Vehicles team and develop state-of-the-art deep learning / AI algorithms for its advanced autonomous driving platform.
Boulder, CO, Deep Learning, NVIDIA, Software Engineer
- NVIDIA: Deep Learning Inference Software Engineer (TensorRT) - Jan 4, 2018.
NVIDIA is seeking a Senior Deep Learning Inference Software Engineer (TensorRT) and is hiring software engineers for its GPU-accelerated deep learning team.
CA, Deep Learning, NVIDIA, Santa Clara, Software Engineer
- NVIDIA: Director – Computer Vision/Deep Learning for Imaging Software - Jan 4, 2018.
Nvidia is seeking an experienced engineering leader to direct its Deep Learning and Computer Vision efforts in imaging software.
CA, Computer Vision, Deep Learning, Director, NVIDIA, Santa Clara
- NVIDIA DGX Systems – Deep Learning Software Whitepaper - Nov 20, 2017.
Download this NVIDIA DGX Systems whitepaper to gain insight into the engineering expertise and innovation behind the pre-optimized deep learning frameworks available only on NVIDIA DGX Systems, and learn how to dramatically reduce your engineering costs using today’s most popular frameworks.
Deep Learning, ebook, Free ebook, GPU, Neural Networks, NVIDIA
- Download NVIDIA DGX Systems eBook - Oct 17, 2017.
In this eBook, you will learn how NVIDIA DGX Systems offer the fastest path to AI and deep learning, how to spend more time focused on experimentation and less time wrestling with IT, and how using DGX Systems includes access to NVIDIA-optimized deep learning frameworks.
AI, ebook, Free ebook, NVIDIA
- Top 10 Recent AI videos on YouTube - May 10, 2017.
The top-viewed videos on artificial intelligence since 2016 include great talks and lecture series from MIT and Caltech, as well as Google Tech Talks on AI.
AI, Google, Machine Learning, MIT, Neural Networks, NVIDIA, Robots, Youtube
- Deep Learning – Past, Present, and Future - May 2, 2017.
There is a lot of buzz around deep learning technology. First developed in the 1940s, deep learning was meant to simulate neural networks found in brains, but in the last decade 3 key developments have unleashed its potential.
Andrew Ng, Big Data, Deep Learning, Geoff Hinton, Google, GPU, History, Neural Networks, NVIDIA
- The HPI Future SOC Lab offers researchers free access to a powerful Big Data infrastructure - Mar 10, 2017.
The HPI Future SOC (Service-Oriented Computing) Lab is a cooperation of the Hasso Plattner Institute (HPI) and industrial partners, providing free access to a powerful Big Data & Computing infrastructure. It is now accepting project proposals for 2017.
Cloud Computing, Data Incubator, In-Memory Computing, Machine Learning, NVIDIA, Research
- Why Go Long on Artificial Intelligence? - Feb 17, 2017.
We are now at the right place and time for AI to be the set of technology advancements that can help us solve challenges whose answers reside in data. While we have already seen a few AI bull and bear markets since the 1950s, this time it’s different. If I and others are right, the implications are immensely valuable for all.
AI, Artificial Intelligence, Investment, NVIDIA
- Data Science Deployments With Docker - Dec 1, 2016.
With the recent release of NVIDIA’s nvidia-docker tool, accessing GPUs from within Docker is a breeze. In this tutorial we’ll walk you through setting up nvidia-docker so you too can deploy machine learning models with ease; a Python sketch of GPU-enabled container launching follows this entry.
Data Science, Docker, GPU, indico, NVIDIA
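The original tutorial predates Docker's native --gpus support; as a rough modern equivalent, here is a minimal sketch using the docker-py SDK (DeviceRequest requires docker-py >= 4.3, and the CUDA image name is illustrative):

```python
import docker  # docker-py, the Python SDK for the Docker engine

client = docker.from_env()

# Request all GPUs for the container; DeviceRequest is the SDK
# equivalent of `docker run --gpus all`. Image name is illustrative.
output = client.containers.run(
    "nvidia/cuda:11.0-base",
    "nvidia-smi",
    device_requests=[docker.types.DeviceRequest(count=-1, capabilities=[["gpu"]])],
    remove=True,
)
print(output.decode())
```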
- Parallelism in Machine Learning: GPUs, CUDA, and Practical Applications - Nov 10, 2016.
Machine learning tasks that lack parallel processing leave substantial performance on the table, and fixing that may very well be worth the trouble. Read on for an introductory overview of GPU-based parallelism, the CUDA framework, and some thoughts on practical implementation; a minimal kernel sketch follows this entry.
Algorithms, CUDA, GPU, NVIDIA, Parallelism
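To make the idea concrete, here is a minimal vector-addition kernel written with Numba's CUDA support, one of several Python routes to CUDA; the 256-thread block size is a common convention, not a requirement:

```python
import numpy as np
from numba import cuda

@cuda.jit
def vector_add(a, b, out):
    # Each CUDA thread handles one element of the arrays.
    i = cuda.grid(1)
    if i < out.size:
        out[i] = a[i] + b[i]

n = 1_000_000
a = np.arange(n, dtype=np.float32)
b = 2 * a
out = np.zeros_like(a)

# Launch enough 256-thread blocks to cover all n elements.
threads = 256
blocks = (n + threads - 1) // threads
vector_add[blocks, threads](a, b, out)  # Numba copies arrays to/from the GPU

assert np.allclose(out, a + b)
```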
- NVIDIA: Deep Learning Library Software Development Engineer - Oct 20, 2016.
Researchers and companies are using GPUs to power a revolution in deep learning, enabling breakthroughs in problems from image classification to speech recognition to natural language processing. Join the team that is building software that will be used by the entire world.
CA, Deep Learning, GPU, NVIDIA, Santa Clara, Software Engineer
- NVIDIA: Developer Technology Engineer – Autonomous Driving - Oct 19, 2016.
Seeking a Developer Technology Engineer – Autonomous Driving to be a member of the automotive team. The candidate will be responsible for working with cutting-edge applications of Deep Learning, computer vision and image processing on NVIDIA’s next-generation automotive products.
CA, Deep Learning, Developer, Engineer, NVIDIA, Santa Clara, Self-Driving Car
- NVIDIA: Technical Account Manager - Oct 19, 2016.
Seeking a Technical Account Manager/Sales Engineer to join the team supporting development and sales activities for the Facebook account with a focus across multiple domains including: Artificial Intelligence, Video, Virtual Reality, Server Design and Data Center Integration.
CA, Manager, Member of Technical Staff, NVIDIA, Sales Engineer, Santa Clara
- NVIDIA: Solution Architect (Eastern Region) - Oct 19, 2016.
Seeking a world-class engineer/scientist for an exciting role as a Solutions Architect. Work with the most exciting high-performance computing hardware and software on impactful projects.
Deep Learning, Developer, Engineer, NVIDIA, NY, Telecommute
- NVIDIA: Director for Autonomous Vehicle Localization and Mapping - Oct 19, 2016.
Seeking a Director for Autonomous Vehicle Localization and Mapping, requiring someone who can formulate and execute a technical roadmap from architectural specification through the full lifecycle of the product in the field.
CA, Director, Maps, NVIDIA, Santa Clara, Self-Driving Car
- NVIDIA: Senior Deep Learning R&D Engineer – Autonomous Driving - Oct 19, 2016.
Seeking a Senior Deep Learning R&D Engineer, with the opportunity to innovate from algorithms, to system design, to processor architecture and see your work used in cars all over the world.
Deep Learning, Engineer, Holmdel, NJ, NVIDIA, Self-Driving Car
- NVIDIA: Senior Research Scientist (Deep Learning) - Oct 19, 2016.
Seeking a Senior Research Scientist (Deep Learning) to conceive deep learning approaches to solving particular product problems, construct and curate large problem specific datasets, and design and implement machine learning techniques aimed at solving specific problems.
CA, Deep Learning, NVIDIA, Research Scientist, Santa Clara
- Basics of GPU Computing for Data Scientists - Apr 7, 2016.
With the rise of neural networks in data science, the demand for computationally intensive machines has led practitioners to GPUs. Learn how to get started with GPUs and the algorithms that can leverage them; a small GPU ufunc sketch follows this entry.
Algorithms, CUDA, Data Science, GPU, NVIDIA
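As a gentle starting point, here is a minimal sketch that compiles a NumPy-style ufunc for the GPU with Numba's vectorize decorator; the Gaussian function is just an example workload:

```python
import math
import numpy as np
from numba import vectorize

# Compile a NumPy-style ufunc that runs on the GPU; switching
# target to 'cpu' or 'parallel' reuses the same function body.
@vectorize(["float32(float32, float32)"], target="cuda")
def gaussian(x, sigma):
    return math.exp(-(x * x) / (2.0 * sigma * sigma))

x = np.linspace(-3, 3, 1_000_000, dtype=np.float32)
y = gaussian(x, np.float32(1.0))  # scalar sigma broadcasts over x
print(y[:5])
```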
- NVIDIA: Senior Data Mining Analyst - Aug 16, 2015.
Fill a key role in our Business Planning & Analytics Team, the analytic hub of NVIDIA’s product marketing organization.
CA, Data Mining Analyst, GPU, NVIDIA, Santa Clara
- Top KDnuggets tweets, Jul 21-27: Beginner Guide to Time Series Analysis; Free Deep Learning online course - Jul 28, 2015.
Beginner #Guide to #TimeSeries #Analysis; Nvidia free #online course: Intro to #DeepLearning ; To Code or Not to Code with @KNIME; Guide To Linear #Regression
Deep Learning, Knime, NVIDIA, Quants, Time Series
- MSc in Data Science/Executive Big Data Analysis in Paris or Nice - Jun 1, 2015.
Get an MSc in Data Science or an MSc in Executive Big Data Analysis through DSTI’s intensive programmes at its Paris or Nice campuses.
Data ScienceTech Institute, DSTI, France, MS in Data Science, Nice, NVIDIA, Paris
- Deep Learning with Structure – a preview - May 6, 2015.
A big problem with Deep Learning networks is that their internal representation lacks interpretability. At the upcoming #DeepLearning Summit, Charlie Tang, a student of Geoff Hinton, will present an approach to address this concern - here is a preview.
Deep Learning, Geoff Hinton, Image Recognition, NVIDIA, RE.WORK
- CuDNN – A new library for Deep Learning - Sep 19, 2014.
Deep learning is becoming more and more popular and has proved useful in artificial intelligence. Last week, NVIDIA’s new library for deep neural networks, cuDNN, attracted much attention; a sketch of how frameworks surface cuDNN follows this entry.
Convolutional Neural Networks, Deep Learning, GPU, NVIDIA, Yann LeCun
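cuDNN itself is a C library that frameworks call under the hood rather than something you invoke directly from Python. As a modern illustration (PyTorch postdates this article), here is a minimal sketch of how a framework exposes its cuDNN binding; it assumes a CUDA-capable GPU:

```python
import torch

# PyTorch exposes its cuDNN binding for inspection.
print(torch.backends.cudnn.is_available())  # True if cuDNN is usable
print(torch.backends.cudnn.version())       # e.g. 8200 for cuDNN 8.2

# A convolution on a CUDA tensor dispatches to cuDNN kernels.
conv = torch.nn.Conv2d(3, 16, kernel_size=3).cuda()
x = torch.randn(1, 3, 224, 224, device="cuda")
y = conv(x)
print(y.shape)
```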