Data By the Bay is the first Data Grid conference matrix, with 6 vertical application areas spanned by multiple horizontal data pipelines, platforms, and algorithms. We are unifying data science and data engineering, showing what really works to run businesses at scale.
Thursday, May 19 • 1:10pm - 1:30pm
Fast deep recurrent net training


Deep recurrent nets are the extension of deep neural nets to processing and producing sequential data. They have exploded onto the deep learning scene over the past few years, are no longer considered hard to train, and have enabled progress on everything from speech recognition and language modeling to image captioning. In this talk, we will look at what recurrent nets can do for you, and go over some tips and tricks we've learnt from building Deep Speech, so that you can train seriously deep recurrent networks on your own.

Some knowledge of recurrent nets is expected, e.g. having read http://karpathy.github.io/2015/05/21/rnn-effectiveness/
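For readers wanting a quick refresher before the talk, the core idea of a recurrent net (processing a sequence while carrying a hidden state forward) can be sketched in a few lines. This is a minimal, generic vanilla-RNN step in NumPy, not the Deep Speech architecture; the weight names (Wxh, Whh, Why) and dimensions are illustrative assumptions.

```python
import numpy as np

def rnn_step(x, h_prev, Wxh, Whh, Why, bh, by):
    """One time step: fold the current input into the hidden state,
    then read out an output for this step."""
    h = np.tanh(Wxh @ x + Whh @ h_prev + bh)  # new hidden state
    y = Why @ h + by                          # per-step output
    return h, y

# Illustrative sizes and small random weights (not from the talk).
rng = np.random.default_rng(0)
n_in, n_hid, n_out = 4, 8, 3
Wxh = rng.standard_normal((n_hid, n_in)) * 0.01
Whh = rng.standard_normal((n_hid, n_hid)) * 0.01
Why = rng.standard_normal((n_out, n_hid)) * 0.01
bh, by = np.zeros(n_hid), np.zeros(n_out)

# Run a short sequence through the net, carrying the hidden state along.
h = np.zeros(n_hid)
for x in rng.standard_normal((5, n_in)):
    h, y = rnn_step(x, h, Wxh, Whh, Why, bh, by)
```

Making such nets "deep" means stacking several recurrent layers, each feeding its per-step outputs to the next; training them at scale is the subject of the talk.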

Speakers

Sanjeev Satheesh

Deep learning researcher, Baidu USA
Sanjeev is a deep learning researcher at the Silicon Valley AI Lab (SVAIL) at Baidu USA. SVAIL is focused on the mission of using hard AI technologies to impact hundreds of millions of users.


Thursday May 19, 2016 1:10pm - 1:30pm
Gardner
