
Top /r/MachineLearning Posts, December: OpenAI Universe; Deep Learning MOOC For Coders; Musk: Tesla Gets Awesome-er


 

OpenAI Universe; Deep Learning For Coders—18 hours of lessons for free; Elon Musk on Twitter: Tesla Autopilot vision neural net now working well; Apple to Start Publishing AI Research; Duolingo's "half-life regression" method for modeling human memory



December on /r/MachineLearning brought us OpenAI's Universe, a deep learning MOOC for coders, Elon Musk trumpeting Tesla's Autopilot functionality, Apple's decision to get back to participating in open research, and Duolingo's new statistical model of human memory for language learning.

The top 5 /r/MachineLearning posts of the past month are:

1. OpenAI Universe

OpenAI has released Universe, a platform for training an AI's general intelligence "across the world's supply of games, websites and other applications." From the OpenAI blog post:

Universe allows an AI agent to use a computer like a human does: by looking at screen pixels and operating a virtual keyboard and mouse. We must train AI systems on the full range of tasks we expect them to solve, and Universe lets us train a single agent on any task a human can complete with a computer.

OpenAI Universe

One can't help but comment on the ease of entry into "artificial intelligence" these days. The "let us help you help us" model of AI training-slash-research is poised to (hopefully) pay dividends moving forward.

2. Deep Learning For Coders—18 hours of lessons for free

Are you a coder interested in learning Deep Learning? This course may be for you.

Deep Learning MOOC

Welcome to fast.ai's 7 week course, Practical Deep Learning For Coders, Part 1, taught by Jeremy Howard (Kaggle's #1 competitor 2 years running, and founder of Enlitic). Learn how to build state-of-the-art models without needing graduate-level math—but also without dumbing anything down. Oh and one other thing... it's totally free!

3. Elon Musk on Twitter: Tesla Autopilot vision neural net now working well. Just need to get a lot of road time to validate in a wide range of environments.

Musk PSA: Tesla gets more awesome, neural nets are better drivers than you.

Musk tweet

This is merely a link to a tweet (shown above). Nothing more. Move along...

4. Apple to Start Publishing AI Research

This is a link to a Bloomberg article on Apple's decision to start publishing deep learning-related research, in an attempt to accelerate the state of the technology. Why now?

Researchers say among the reasons Apple has failed to keep pace is its unwillingness to allow its AI engineers to publish scientific papers, stymieing its ability to feed off wider advances in the field.

There has been a feeling that Apple may be losing its innovative edge. Contributing back to the research community -- and being able to take advantage of others' research in return, while becoming a more appealing employer for research-oriented candidates -- may help reverse either this perception or the underlying reality.

The short article also mentions the numerous AI startups that Apple has acquired recently, including Seattle's Turi.

5. Duolingo's "half-life regression" method for modeling human memory

This is a link to a Duolingo blog post titled "How we learn how you learn." It's an insight into how the language-learning website makes use of data.

[W]e're serious about taking a scientific, data-driven approach to all of our products, and about sharing what we learn with the world. In this post, we'll take a look at the science behind the Duolingo skill strength meter, which we published in an Association for Computational Linguistics article earlier this year...

Duolingo Weak Skills

The post outlines Duolingo's half-life regression statistical model (which is also covered in the cited paper).

Through our research, we invented a new statistical model we call half-life regression (HLR), inspired by other methods used in "big data" like logistic regression, but using an exponential probability function...

The treatment in the blog post is nearly math-free, giving a plain-English overview that can be elaborated on, if desired, by reading the accompanying paper.
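The quoted description is enough to sketch the model's two core equations: recall probability decays exponentially with time since last practice, and the half-life of that decay is itself predicted from practice-history features. Below is a minimal sketch in Python; the feature names and weight values are illustrative assumptions of mine, not Duolingo's (per the paper, the real weights are learned from practice-log data).

```python
# Hedged sketch of half-life regression (HLR): p = 2^(-delta / h),
# where the half-life h = 2^(theta . x) is predicted from features
# of the learner's practice history. Features/weights are illustrative.

def predicted_half_life(weights, features):
    """Half-life in days: h = 2^(theta . x)."""
    dot = sum(weights[k] * v for k, v in features.items())
    return 2.0 ** dot

def recall_probability(delta_days, half_life):
    """p = 2^(-delta / h): chance the learner still recalls the word
    delta_days after last practicing it."""
    return 2.0 ** (-delta_days / half_life)

# Example: a word seen 4 times, recalled correctly 3 times.
features = {"bias": 1.0, "times_seen": 4.0, "times_correct": 3.0, "times_wrong": 1.0}
weights = {"bias": 0.5, "times_seen": 0.1, "times_correct": 0.4, "times_wrong": -0.3}

h = predicted_half_life(weights, features)  # 2^1.8, roughly 3.5 days
p_now = recall_probability(0.0, h)          # 1.0 right after practice
p_week = recall_probability(7.0, h)         # decayed well below 1.0 after a week
```

The "regression" part, omitted here, fits the weights by minimizing error between predicted and observed recall rates; correct recalls pushing half-lives up and wrong ones pushing them down matches the intuition behind the skill strength meter.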
