Deep Learning is affecting our everyday lives, and it can be used to unlock even more valuable insights in your data.

But before I get into how that’s possible, I’ll talk about what deep learning is, and how it is used.

Deep Learning is a technology that allows computers to learn like “animals” do, meaning that computers mimic nature’s ability to learn.

While computers may not need to learn the same things that natural beings learn in their lives, they learn to work with the data that humans don’t like to work with, for two main reasons:

1) The data is too vast. It’s impossible for a group of humans to comb through it and find meaningful patterns that can help you or your business.

2) The data is too complex. There are too many relationships between seemingly unrelated content, which humans and conventional computing algorithms simply can’t handle with any real efficiency or performance.

Because Deep Learning allows computers to understand data that even humans find difficult to understand, it has been applied in hundreds of different fields, with very positive results. These fields include:

  • Banking
  • Insurance
  • Fashion
  • Technology
  • Food
  • Retail
  • Business
  • Everyday Tasks

And so much more!

However, Deep Learning as a technology is very complicated: you’re developing algorithms to understand data that you yourself can’t understand. This is what stops most developers, businesses, enterprises, and even regular people from trying to unlock the power of Deep Learning on the data they’re gathering every day.

To combat this, Deep Learning APIs exist. These APIs can be consumed through many different interfaces, including code. IBM Watson, however, took a different approach to the problem.
IBM Watson is a suite of REST APIs that provide powerful Machine Learning algorithms you can apply to your own data. In fact, since it’s a REST interface, you can incorporate it into almost any platform imaginable, big or small.
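Since it’s just REST, any language with an HTTP client can talk to Watson. Here’s a minimal sketch in Python (using the requests library) against the Natural Language Understanding service; the service URL, instance ID, and API key below are placeholders, so you’d swap in your own credentials from the IBM Cloud dashboard.

```python
# A minimal sketch of calling a Watson REST API from Python.
# SERVICE_URL and API_KEY are placeholders: copy the real values
# from your service's credentials page on IBM Cloud.
import requests

SERVICE_URL = "https://api.us-south.natural-language-understanding.watson.cloud.ibm.com/instances/YOUR_INSTANCE_ID"
API_KEY = "YOUR_API_KEY"

response = requests.post(
    f"{SERVICE_URL}/v1/analyze",
    params={"version": "2019-07-12"},
    auth=("apikey", API_KEY),  # IBM Cloud services accept "apikey" basic auth
    json={
        "text": "I love how easy it is to plug Watson into my app!",
        "features": {"sentiment": {}, "keywords": {}},
    },
)
print(response.json())  # sentiment scores and extracted keywords
```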

I’ve been able to create a lot with IBM Watson, including:

1) Audiology: Hearing Disorder Diagnosis
2) Chatbot-controlled Drones (TanDro)
3) NLQA (Natural Language Question Answering) Systems (AskTanmay)
And many more!

However, while IBM Watson provides a vast variety of APIs, enabling it to be used in thousands of use cases, there’s one problem a Deep Learning researcher would face: it isn’t flexible. You can’t invent new algorithms with Watson (at least, not yet).

Since DL (Deep Learning) researchers wanted a faster way to prototype, build, and test their deep learning systems, they went ahead and created AI toolkits that work in real code, offline. You may have heard of a few of these libraries, namely TensorFlow, Theano, MXNet, Keras, and more! I won’t expand too much on how these actually work, but if you’d like to find out, you can go to their GitHub repositories.
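To give a sense of what working “in real code” looks like, here’s a minimal Keras sketch of a tiny classification network. The input shape and class count are arbitrary assumptions for illustration, not anything prescribed by the library.

```python
# A tiny feed-forward network in Keras: flattened 28x28 inputs, 10 classes.
from tensorflow import keras

model = keras.Sequential([
    keras.layers.Dense(128, activation="relu", input_shape=(784,)),  # hidden layer
    keras.layers.Dense(10, activation="softmax"),                    # one probability per class
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.summary()
# model.fit(x_train, y_train, epochs=5)  # train on your own data
```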

To give you a glimpse into what’s possible with Deep Learning, I’d like to show you a few of the different use-cases in which I’ve been able to incorporate this technology.

Natural Language Processing

NLP is a very difficult task for computers, mainly for two reasons:

1) Vocabulary is very vast (one word could have, say, 10 synonyms)
2) Grammar is very dynamic (one sentence or paragraph could be restructured or phrased differently in hundreds of different ways)

While there have been attempts in the past, sometimes rather successful, to understand natural language using algorithms, the accuracy and performance of those systems pale in comparison to that of Deep Learning.

Deep Learning allows algorithms like Neural Networks to understand the semantics of Natural Language in numeric form. Once the Neural Network has learned how to understand those semantics, it should be ready to perform, even if it hasn’t seen the entire English vocabulary, since words with similar meanings end up with similar numeric representations.
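To make “semantics in numeric form” concrete, here’s a toy sketch: each word maps to a vector, and words with similar meanings point in similar directions. The three-dimensional vectors below are invented for illustration; a real network learns hundreds of dimensions from data.

```python
# Toy word embeddings: similar meanings -> similar vectors.
import numpy as np

embeddings = {
    "good":     np.array([0.9, 0.1, 0.0]),
    "great":    np.array([0.8, 0.2, 0.1]),
    "terrible": np.array([-0.7, 0.1, 0.2]),
}

def cosine(a, b):
    # Cosine similarity: 1.0 means same direction, negative means opposite.
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

print(cosine(embeddings["good"], embeddings["great"]))     # high: similar meaning
print(cosine(embeddings["good"], embeddings["terrible"]))  # low: opposite meaning
```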

One example of how I was able to use Deep Learning to outperform a regular-expression-based Natural Language Processing system is my project DeepSPADE, which stands for Deep Spam Detection.

As we know, StackExchange (SE) is a network of 169 forum sites, with topics ranging from Programming, Linux, and Math, to Cooking, Aeronautics, Health, and a lot more.

StackOverflow, the most popular site in the SE network, has gathered over 14,500,000 questions in the 8 years it’s been up, with over 6,500,000 of those questions confirmed as answered.

However, StackOverflow alone receives over 30 spam posts per day, and over time that adds up to a lot of spam.

To combat this, a group of programmers called Charcoal SE developed a system that finds spam using complex RegEx patterns.

Once I found out about this, I immediately set out to build a Deep Learning based Spam Classification system for the Stack Exchange network. 7 days later, I had created DeepSPADE.

DeepSPADE is an extremely deep Neural Network that takes in a post from StackExchange and tells you whether or not it’s spam. It’s reached 99.1% accuracy on 16,000 testing rows.
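I won’t reproduce DeepSPADE’s exact architecture here, but the general recipe for a deep text classifier looks roughly like the following sketch: embed the post’s words as vectors, read the sequence, and output a single spam probability. The vocabulary size is an assumption for illustration.

```python
# A minimal sketch of a text spam classifier in Keras. This is NOT the
# actual DeepSPADE architecture, just the general recipe:
# embed words -> read the sequence -> output P(spam).
from tensorflow import keras

VOCAB_SIZE = 20000  # assumed vocabulary size

model = keras.Sequential([
    keras.layers.Embedding(VOCAB_SIZE, 128),          # words -> vectors
    keras.layers.LSTM(64),                            # read the post as a sequence
    keras.layers.Dense(1, activation="sigmoid"),      # probability the post is spam
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
# model.fit(x_train, y_train, validation_data=(x_test, y_test), epochs=3)
```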

But wait, why is this important to you? After all, we’re only getting 30 spam posts per day. Even if the spam adds up, what harm could it cause?

Well, there are two reasons:
1) Spam kills users. The more spam you have on your website, the less your users will like your website.
2) Spam costs money. Let’s say only 10 of the 30 daily spam posts on your website get reported, so 20 slip through every day. In one week, you collect 140 unreported spam posts; in a month, 560; in a year, 6,720; in 7 years, 47,040. You’re wasting a huge amount of server space hosting content that’s completely worthless to your website (in fact, it actively degrades its value). Plus, it’s a pain for the loyal users who trust the forum.
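For the curious, the back-of-the-envelope math above works out like this (counting a month as 28 days and a year as 12 such months):

```python
# Spam accumulation, assuming 30 spam posts/day of which only 10
# get reported, so 20 linger on the site each day.
unreported_per_day = 30 - 10
print(unreported_per_day * 7)            # one week:    140
print(unreported_per_day * 28)           # one month:   560
print(unreported_per_day * 28 * 12)      # one year:    6,720
print(unreported_per_day * 28 * 12 * 7)  # seven years: 47,040
```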

This can unlock huge potential for your business, and not just for spam detection specifically. The best part about Deep Learning is that, with hardly any changes to the code, you can make this same kind of model classify natural-language sentiment, separate fake reviews from real ones, run answer retrieval, generate images, predict gas, stock, or gold prices, or tackle almost any other sequential-data problem.

Deep Learning has been around for over 60 years, but only recently has it become popular. With the advent of GPUs, which offer far more computing power than we’ve ever had, Deep Learning has been put in the hands of more researchers and developers. Thus, DL is now reaching more consumers and the products they use every day, through technologies like Core ML by Apple and TensorFlow by Google.

Now you can start harnessing the power of Deep Learning yourself: without touching a single line of code through IBM Watson, or with just a few lines of code through libraries like Keras!


Tanmay Bakshi

Thirteen-year-old Software & Cognitive Developer, TEDx & Keynote Speaker, Honorary IBM Cloud Advisor & IBM Champion, and author of the book “Hello Swift! - iOS App Programming for Kids and Other Beginners”.