Evolving Business Application of Natural Language Processing

Natural language technologies are evolving into a new and more powerful role. Previously, natural language interfaces generally connected consumers to transaction systems. Now, businesses are integrating natural language content into processes orchestrated by robotic process automation, with decisions optimized by machine learning.

The earliest adopters of natural language processing, including the intelligence community, investment managers, and law firms, used NLP to manage very high volumes of documents. They measured sentiment in social media and assessed samples from billions of documents in class action suits. However, these applications perform a point function rather than powering an end-to-end workflow.

The new generation of workflows integrates steps that analyze language rather than simply mapping language onto other systems. These applications summarize and generate documents, spot nuanced differences across series of related documents such as regulatory filings, and monitor the social physics that influence organizations’ business and compliance results.

Business applications differ from consumer applications because people communicate differently in each context. Consumer content, such as tweets, blogs, or product reviews, is short, focuses on one topic, and expresses clear sentiment. In contrast, business content, such as Economist articles or analyst research from investment banks, is typically longer, references multiple related entities such as companies within an industry, and expresses sentiment more subtly.

Natural language processing can be simplified into these steps (a brief code sketch of the first few steps follows the list):

1. Language detection.

2. Sentence separation and tokenization, which normalize words. For example, converting “aren’t” to “are not”, or separating compound nouns in languages, such as German, that combine words.

3. Entity extraction, closely related to topic modelling, which identifies companies, people, places, etc.

4. Part-of-speech analysis, also known as syntactic analysis, which parses sentence structure such as nouns and verbs.

5. Semantic analysis, which discerns the meaning of smaller units such as phrases and sentences.

6. Pragmatic analysis, also called contextual analysis, which analyzes how words and phrases are positioned relative to each other. For example, an entity or sentiment may be weighted more heavily if it appears in the leading sentence or first paragraph of an article.
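As a rough illustration, the sketch below walks through steps 2 through 4 in Python, assuming the open-source spaCy library and its small English model are installed; the sample sentence is invented for illustration.

    import spacy

    # Assumes the small English model has been installed:
    #   pip install spacy && python -m spacy download en_core_web_sm
    nlp = spacy.load("en_core_web_sm")

    doc = nlp("Acme Corp. isn't expanding its Berlin office this year.")

    # Step 2: sentence separation and tokenization ("isn't" becomes the tokens "is" and "n't").
    print([token.text for token in doc])

    # Step 3: entity extraction (companies, places, dates, and so on).
    print([(ent.text, ent.label_) for ent in doc.ents])

    # Step 4: part-of-speech analysis (nouns, verbs, and so on).
    print([(token.text, token.pos_) for token in doc])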

NLP is usually just one part of a broader process. Business applications often put NLP results into context to make a decision. For example, consider an investor analyzing how stakeholders view a company. Topics and sentiment for a given article or point in time are less informative than trends, comparisons with other companies, or the topics that drove changes in sentiment.
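As a hedged sketch of putting results into context, the snippet below assumes an upstream NLP step has already produced a daily sentiment score per company (the company names and scores are hypothetical) and uses pandas to compare a smoothed trend against a peer.

    import pandas as pd

    # Hypothetical daily sentiment scores (-1 to +1) already produced by an upstream NLP step.
    data = pd.DataFrame({
        "date": pd.date_range("2024-01-01", periods=10, freq="D"),
        "acme_corp": [0.2, 0.1, 0.15, -0.1, -0.3, -0.25, -0.4, -0.35, -0.2, -0.1],
        "peer_co":   [0.1, 0.12, 0.11, 0.09, 0.10, 0.08, 0.11, 0.12, 0.10, 0.09],
    }).set_index("date")

    # A 3-day rolling mean smooths single-article noise and exposes the trend.
    trend = data.rolling(window=3).mean()

    # Comparing against a peer is more informative than a single point-in-time score.
    latest = trend.iloc[-1]
    print(f"Acme 3-day trend: {latest['acme_corp']:.2f} vs peer: {latest['peer_co']:.2f}")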

Practical experience leads to three core lessons for business NLP:

1. Build domain-specific models. General-purpose commercial offerings like Microsoft Azure or Amazon Web Services are effective for generic tasks, such as identifying the language of a text, but less effective for specific business decisions within the context of a given company (a minimal sketch of a domain-specific model follows this list).

2. Manage data thoughtfully. Explicitly plan and design the data sets used to train models. Identify variations, edge cases, and exceptions to ensure they are included. Be aware of any subjective biases held by the people creating training data sets. Build large data sets to lessen the impact of any individuals with outlying views. Watch for industries with more complex scenarios. For example, pharmaceuticals can produce complex blends of negatives and positives, such as “the promising new drug expected to cure Alzheimer’s has not performed in clinical trials”.

3. Language is inherently subjective. Set expectations that NLP is imprecise because people are imprecise. Different perspectives can produce different subjective views. For example, we have seen clients that reasonably viewed winning a lawsuit as a positive event; however, consumers were simply reminded of the allegations, and the news actually hurt how the company was viewed. View with skepticism any claims that a natural language application can produce results with certainty.
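As one possible way to act on the first lesson, the sketch below trains a small domain-specific sentiment classifier with scikit-learn; the labeled examples are invented, and a real training set would need the scale and careful curation described in the second lesson.

    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.linear_model import LogisticRegression
    from sklearn.pipeline import make_pipeline

    # Hypothetical, tiny labeled set; a production model needs thousands of curated examples.
    texts = [
        "The promising new drug has not performed in clinical trials.",
        "Regulators approved the treatment ahead of schedule.",
        "The company settled the lawsuit, reminding consumers of the allegations.",
        "Quarterly results beat analyst expectations.",
    ]
    labels = ["negative", "positive", "negative", "positive"]

    # TF-IDF features plus logistic regression: a simple, domain-tuned baseline.
    model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
    model.fit(texts, labels)

    print(model.predict(["The new drug did not meet its trial endpoints."]))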

Technology for NLP has evolved from hand-coded rules, to statistical approaches based on probability, and now to machine learning that continually improves. Machine learning models find subtler relationships in data, bring more resilience to unexpected inputs, and find answers from partial input; for example, broadcast media transcripts often include spoken speech patterns with partial sentences and phrases.

As with all analytic projects, architects should not automatically assume machine learning is the solution. The simplest and most cost-effective path to achieving business goals often includes a mixture of rules, classic statistical models, and machine learning.
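A minimal sketch of that mixture, with hypothetical keyword rules: unambiguous cases are decided by a cheap rule, and a trained model (such as the classifier sketched earlier) is called only when the rules are silent.

    # Hypothetical keyword rules; unambiguous cases never reach the model.
    POSITIVE_CUES = {"record profit", "beat expectations"}
    NEGATIVE_CUES = {"bankruptcy filing", "fraud charges"}

    def score_sentiment(text, ml_model):
        lowered = text.lower()
        if any(cue in lowered for cue in POSITIVE_CUES):
            return "positive"   # decided by a cheap rule, no model call needed
        if any(cue in lowered for cue in NEGATIVE_CUES):
            return "negative"
        # Fall back to the more expensive trained model only when the rules are silent.
        return ml_model.predict([text])[0]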

A real-world application may first use AWS or Azure for language recognition or translation. Next, the application may use manually defined rules to eliminate irrelevant messages. For example, social media can include many references to a company’s sponsorship of sports teams, which has little business meaning. Pattern-matching rules can also resolve ambiguous names like Orange, the telco, versus the fruit, or recognize that Coca-Cola and Coke refer to the same company. Rules reduce the volume of data moving to more expensive and operationally riskier steps in a process.
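A hedged sketch of such rules in Python, with an illustrative alias table and sponsorship pattern defined by hand for the companies being monitored:

    import re

    # Hypothetical alias table mapping informal names to a canonical company name.
    ALIASES = {"coke": "Coca-Cola", "coca cola": "Coca-Cola"}

    # Sponsorship chatter carries little business meaning; drop it before costlier steps.
    SPONSORSHIP_PATTERN = re.compile(r"\b(sponsor|sponsorship|stadium|jersey)\b", re.IGNORECASE)

    def preprocess(message):
        if SPONSORSHIP_PATTERN.search(message):
            return None  # filtered out before the more expensive NLP steps
        for alias, canonical in ALIASES.items():
            message = re.sub(rf"\b{re.escape(alias)}\b", canonical, message, flags=re.IGNORECASE)
        return message

    print(preprocess("Coke unveils a new jersey sponsorship deal"))  # None (filtered out)
    print(preprocess("Coke raised prices across Europe"))            # Coca-Cola raised prices across Europe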

Commercial services are evolving rapidly. A recent informal assessment for a client confirmed prior views. AWS works well with consumer content and supports many languages. Azure is accurate and fast but less feature-rich. Google includes interesting features such as the proportion of emotional content in a document. IBM is arguably the most effective at complex business NLP needs, like understanding long documents, but is expensive and requires moving data to and from the IBM cloud.

Looking forward, many business processes will use natural language data orchestrated by robotic process automation with decisions optimized by artificial intelligence models. Fusing these three technologies will drive the future of business process automation.
