A History of Natural Language Processing

When we last left off, we went over the basics of natural language processing. But in reality, the field is much richer than that: it has taken over fifty years for NLP to become as important as it is today. So with the help of our Data Science Intern Dennis, we dove a little deeper into how it works, how to actually use NLP, and some of the hottest trends in NLP today.

The 1960s: The Stone Age

We know the 60s weren’t that long ago, but in terms of NLP, it might as well have been the Stone Age. When the field was conceived in the 1950s, computers lacked the processing power for any of the complicated tasks they can perform today, so researchers had to work with very simple methods, mainly word and pattern recognition.

ELIZA: the Internet’s therapist

One of the earliest applications of word and pattern recognition was ELIZA, a chatbot that pretended to be a psychotherapist. Through basic pattern recognition, ELIZA could recognize types of statements and generate a response. For example, if the user said “My head hurts,” ELIZA would ask “Why does your head hurt?” or “How does that make you feel?” ELIZA’s primitive nature, however, meant she could only generate canned responses, which often led to illogical answers.
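To give a flavor of how this works, here is a minimal sketch of an ELIZA-style pattern matcher in Python. The rules and responses are made up for illustration; they are not ELIZA's actual script.

    import re
    import random

    # Illustrative ELIZA-style rules: a regex pattern plus canned response
    # templates. These are invented examples, not ELIZA's original script.
    RULES = [
        (re.compile(r"my (.+) hurts", re.IGNORECASE),
         ["Why does your {0} hurt?", "How does that make you feel?"]),
        (re.compile(r"i feel (.+)", re.IGNORECASE),
         ["Why do you feel {0}?", "How long have you felt {0}?"]),
    ]

    def respond(statement):
        """Return a canned response for the first rule that matches, if any."""
        for pattern, templates in RULES:
            match = pattern.search(statement)
            if match:
                return random.choice(templates).format(*match.groups())
        return "Please tell me more."

    print(respond("My head hurts"))   # e.g. "Why does your head hurt?"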

Despite the naiveté of these models, they are still used today for many basic tasks. Regular expressions, sequences that represent patterns of letters, numbers, and symbols, make it incredibly simple for developers to find set patterns within text. Here at Welcome, we still use this kind of basic pattern matching to do preliminary analysis of conversations and give our moderators some basic information about what they are looking at.
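As a rough illustration (these are not our actual production rules), a couple of regular expressions are enough to surface basic details from a message:

    import re

    # Illustrative patterns: pull an email address and an order number out of
    # a conversation so a moderator can see them at a glance.
    conversation = "Hi, I'm jane@example.com and my order #48213 never arrived."

    emails = re.findall(r"[\w.+-]+@[\w-]+\.[\w.]+", conversation)
    order_ids = re.findall(r"#(\d+)", conversation)

    print(emails)     # ['jane@example.com']
    print(order_ids)  # ['48213']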


1990s - 2000s: Accelerated Efforts

After a long stretch of stagnant development, the technology boom jumpstarted NLP, resulting in an explosion of new machine learning algorithms. With easier access to faster computers, researchers could also implement these models in a more consumer-friendly manner.

Computers understand human speech

The biggest result of these changes? Speech recognition. Back in the 1960s, researchers already knew that hidden Markov models, which can infer the hidden state of a system from its observable outputs, could be applied to translate human sounds into words. The only problem was that they didn’t have the computing power to make that a reality. It wasn’t until the early 2000s that processors finally became fast enough for common gadgets to perform these calculations in near real-time.
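For the curious, here is a toy sketch of the core idea behind hidden Markov models: given a sequence of observed outputs, recover the most likely sequence of hidden states (the Viterbi algorithm). The states and probabilities below are invented for illustration; real speech recognizers work with phoneme states and acoustic features.

    # Toy Viterbi decoding: recover the most likely hidden state sequence from
    # observations. States and probabilities are made up for illustration.
    states = ["silence", "speech"]
    start_p = {"silence": 0.6, "speech": 0.4}
    trans_p = {"silence": {"silence": 0.7, "speech": 0.3},
               "speech":  {"silence": 0.2, "speech": 0.8}}
    emit_p = {"silence": {"quiet": 0.9, "loud": 0.1},
              "speech":  {"quiet": 0.3, "loud": 0.7}}

    def viterbi(observations):
        # V[t][state] = (probability of the best path ending in state, previous state)
        V = [{s: (start_p[s] * emit_p[s][observations[0]], None) for s in states}]
        for obs in observations[1:]:
            V.append({
                s: max((V[-1][prev][0] * trans_p[prev][s] * emit_p[s][obs], prev)
                       for prev in states)
                for s in states
            })
        # Backtrack from the most probable final state.
        best = max(states, key=lambda s: V[-1][s][0])
        path = [best]
        for t in range(len(V) - 1, 0, -1):
            path.append(V[t][path[-1]][1])
        return list(reversed(path))

    print(viterbi(["quiet", "loud", "loud"]))  # ['silence', 'speech', 'speech']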


Today: Big Data and Deep Learning

Two major barriers have historically kept machine learning algorithms from evolving: the speed at which data can be analyzed, and the quantity of data available to analyze.

Now that we live in a world where virtually everyone has a computer or a smartphone, it is incredibly easy for large technology companies to collect records of what websites we visit, what we say to others, and more. In fact, the data collected has become so diverse and complicated that it is effectively impossible for humans to recognize all the patterns needed to form interesting conclusions. As a result, one type of machine learning model that thrives on big data has become incredibly popular: neural networks trained with deep learning.

Computers: as smart as the human brain?

Neural networks loosely model the way the human brain processes information: they build a graph of artificial neurons and transmit signals of data between these nodes. In theory, this allows such systems to learn almost any data pattern, including human speech and language. Due to the sheer power of these models, they are now used extensively in virtually every popular technology, from Google Translate to self-driving cars.
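Here is a bare-bones sketch of that idea in Python: a tiny network of artificial neurons passes weighted signals forward and nudges its weights based on its errors. The toy XOR data is purely illustrative.

    import numpy as np

    # A tiny two-layer network: weighted signals flow forward, errors flow
    # backward to adjust the weights. The XOR toy data is only an illustration.
    rng = np.random.default_rng(0)
    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
    y = np.array([[0], [1], [1], [0]], dtype=float)

    W1, b1 = rng.normal(size=(2, 8)), np.zeros((1, 8))   # input -> hidden
    W2, b2 = rng.normal(size=(8, 1)), np.zeros((1, 1))   # hidden -> output

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    for step in range(10000):
        hidden = sigmoid(X @ W1 + b1)                    # forward pass
        output = sigmoid(hidden @ W2 + b2)
        error = output - y
        grad_out = error * output * (1 - output)         # backward pass
        grad_hidden = (grad_out @ W2.T) * hidden * (1 - hidden)
        W2 -= 0.5 * hidden.T @ grad_out                  # small weight updates
        b2 -= 0.5 * grad_out.sum(axis=0, keepdims=True)
        W1 -= 0.5 * X.T @ grad_hidden
        b1 -= 0.5 * grad_hidden.sum(axis=0, keepdims=True)

    print(np.round(output, 2))  # predictions should move toward [0, 1, 1, 0]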

The accuracy of neural networks can seem unbelievable, and even scary, in some cases. A Stanford PhD student trained a neural network on Shakespeare’s complete works. Once trained, the model produced this King Lear-style monologue entirely on its own:

King Lear:
O, if you were a feeble sight, the courtesy of your law,
Your sight and several breath, will wear the gods
With his heads, and my hands are wonder'd at the deeds,
So drop upon your lordship's head, and your opinion
Shall be against your honour.

Neural networks can recognize patterns in language that no human ever could, and as we continue to collect more and more data, they will only become more robust and effective. At Welcome, we hope to leverage this technology for a variety of purposes, ranging from fully automated chat tagging to chatbots.


The Near Future

Though effective, neural network technology is still relatively young, and new breakthroughs are made regularly. Cutting-edge research is focused on removing the limitations that remain.

Bigger, faster, stronger

Research is now aiming to create algorithms that allow neural networks to be trained on smaller data sets. If this research is successful, computers will be able to complete highly specialized tasks quickly without nearly as much preparation as is required today. Additionally, companies such as Nvidia and Google have begun building specialized hardware for machine learning in order to train increasingly complex models on increasingly complex data. Experts in the field have even started using this hardware to run simulations that generate more data, faster.

Accessible to all

Machine learning is also becoming more accessible to developers. Open-source resources such as Google’s TensorFlow and SyntaxNet make it easier to implement machine learning models, and the rise of cloud computing makes it simple to distribute these workloads to speed them up. Companies like Apple and Google have already begun making their models available to developers, so even those who lack the mathematical background traditionally needed for machine learning can leverage its power.
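To give a rough sense of how little code this now takes, here is a sketch of a tiny text classifier defined with TensorFlow’s Keras API. The layer sizes and stand-in data are placeholders, not a real model of ours.

    import numpy as np
    import tensorflow as tf

    # A sketch of how compact a model definition can be with TensorFlow's
    # Keras API. Vocabulary size, layer sizes, and "data" are placeholders.
    vocab_size, seq_len = 1000, 20
    model = tf.keras.Sequential([
        tf.keras.layers.Embedding(vocab_size, 16),
        tf.keras.layers.GlobalAveragePooling1D(),
        tf.keras.layers.Dense(16, activation="relu"),
        tf.keras.layers.Dense(1, activation="sigmoid"),  # e.g. positive vs. negative
    ])
    model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])

    # Stand-in data: sequences of word IDs and binary labels.
    X = np.random.randint(0, vocab_size, size=(100, seq_len))
    y = np.random.randint(0, 2, size=(100, 1))
    model.fit(X, y, epochs=2, verbose=0)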


The possibilities are endless

As NLP continues to develop, it will permeate more and more of our technology, and hopefully become usable by just about anyone. Watch for the next post in our series, where we get into the nitty-gritty of natural language processing and how Welcome uses it to power our data intelligence.

Discover how NLP is changing online shopping >