Deep Learning for Text

Deep Learning has created a small revolution in the natural language processing (NLP) community. These methods represent words not as atomic units but as points in a high-dimensional space, allowing a much more fine-grained treatment. Deep Learning models learn these representations using non-linear functions, in order to capture the human intuition that lies behind natural language. Here is one example of how we use them in our research. Language Modeling: the ability to anticipate likely continuations of a sequence of words is central to many areas of NLP (speech recognition, machine translation, natural language generation, image captioning, dialogue). We investigate the use of Recurrent Neural Networks (RNNs) and related models (LSTMs, memory networks, etc.) for this task, and how to modify them to exploit long-distance dependencies while preserving learnability, and to focus attention on…
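The article itself contains no code, but as a rough illustration of the idea, a minimal sketch of a word-level LSTM language model in PyTorch might look like the following. The vocabulary size, embedding dimension, and hidden size are placeholder assumptions, not values from the research described above.

```python
# Minimal sketch of a word-level LSTM language model (PyTorch).
# Vocabulary size, embedding and hidden dimensions are illustrative choices.
import torch
import torch.nn as nn

class LSTMLanguageModel(nn.Module):
    def __init__(self, vocab_size=10000, embed_dim=128, hidden_dim=256):
        super().__init__()
        # Words are represented as points in a continuous space, not atomic units.
        self.embedding = nn.Embedding(vocab_size, embed_dim)
        # The LSTM carries context across the sequence (long-distance dependencies).
        self.lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
        # Project hidden states back to a distribution over the vocabulary.
        self.decoder = nn.Linear(hidden_dim, vocab_size)

    def forward(self, token_ids):
        # token_ids: (batch, seq_len) integer word indices
        embedded = self.embedding(token_ids)   # (batch, seq_len, embed_dim)
        outputs, _ = self.lstm(embedded)       # (batch, seq_len, hidden_dim)
        return self.decoder(outputs)           # logits over the next word at each position

# Example: score likely continuations of a short (dummy) word sequence.
model = LSTMLanguageModel()
tokens = torch.randint(0, 10000, (1, 5))      # a batch of one 5-word sequence
logits = model(tokens)
next_word_probs = torch.softmax(logits[0, -1], dim=-1)
```

Training such a model to predict the next word at every position is what gives it the ability to anticipate likely continuations; the attention and memory-network variants mentioned above extend this basic architecture.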


Link to Full Article: Deep Learning for Text
