Compelling Use Cases For Enterprise AI

It goes without saying that AI is taking our world by storm. From our daily ChatGPT use to self-driving cars, our lives are increasingly shaped by intelligent machines. Will they ever take over our world? Afraid not. We are still far from the Skynet reality made famous by the Terminator universe. Nonetheless, whether we love it or hate it, we need to embrace AI or it might just embrace us. One key area in which AI is being adopted is the enterprise sector. Here are some compelling ways it can boost productivity and efficiency in our businesses.

Semantic Search

We use keyword search almost daily – Google Search being the most familiar example. This type of informational search uses keywords to identify the things we seek. For example, if we search “people with similar life stories to Abraham Lincoln,” Google Search would likely respond with a list of websites containing articles that compare the life of Lincoln with other notable individuals.

Semantic search, on the other hand, focuses on understanding the meaning and intent behind queries rather than just matching keywords. It can interpret the query “people with similar life stories to Abraham Lincoln” and return contextually relevant results by searching its database for people who actually had similar life events – for example, people who lived in the United States during the 19th century, served in the military, built careers in law and politics, and worked to abolish slavery.
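Under the hood, semantic search typically works by turning both the query and the documents into dense vector embeddings and ranking documents by how close their vectors are to the query’s, rather than by shared words. Here is a minimal sketch of that idea, assuming the open-source sentence-transformers library and its all-MiniLM-L6-v2 model; the example documents are invented for illustration.

```python
# Minimal sketch: rank documents by meaning rather than by keywords.
# Assumes the sentence-transformers library and the all-MiniLM-L6-v2
# embedding model; any embedding model could be swapped in.
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("all-MiniLM-L6-v2")

documents = [
    "A 19th-century lawyer and politician who fought to abolish slavery.",
    "Tips for growing tomatoes in a backyard garden.",
    "A biography of a self-taught statesman from rural Illinois.",
]
query = "people with similar life stories to Abraham Lincoln"

# Encode the query and the documents into dense vectors.
doc_vectors = model.encode(documents, convert_to_tensor=True)
query_vector = model.encode(query, convert_to_tensor=True)

# Cosine similarity measures closeness of meaning, not shared keywords.
scores = util.cos_sim(query_vector, doc_vectors)[0].tolist()
for doc, score in sorted(zip(documents, scores), key=lambda p: p[1], reverse=True):
    print(f"{score:.3f}  {doc}")
```

The Lincoln-like biographies should rank above the gardening tip even though none of the documents repeat the words in the query.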

We experience semantic search when conversing with ChatGPT. It understands the meaning of our inquiry without much elaboration, particularly when we are talking with it in a single chat thread, where it keeps track of the context at hand.

An enterprise can adopt semantic search for its internal database system by incorporating a vector database and an LLM (large language model). It can also fortify this AI system with a RAG (Retrieval-Augmented Generation) framework to further improve the accuracy of the internal information it retrieves.
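As a rough illustration of how these pieces fit together, here is a minimal RAG sketch assuming the Chroma vector database and the OpenAI Python client; the document texts, IDs, and model name are placeholders, and any vector store or LLM could take their place.

```python
# Minimal RAG sketch for internal semantic search.
# Assumes the chromadb and openai packages and an OPENAI_API_KEY in the
# environment; the internal documents below are placeholders.
import chromadb
from openai import OpenAI

vector_db = chromadb.Client()
collection = vector_db.create_collection("internal_docs")

# Index internal documents (Chroma embeds them with its default model).
collection.add(
    ids=["policy-001", "policy-002"],
    documents=[
        "Employees may carry over up to five unused vacation days per year.",
        "Expense reports must be submitted within 30 days of purchase.",
    ],
)

def answer(question: str) -> str:
    # Retrieve the most relevant internal passages for the question.
    hits = collection.query(query_texts=[question], n_results=2)
    context = "\n".join(hits["documents"][0])

    # Ask the LLM to answer using only the retrieved context.
    llm = OpenAI()
    response = llm.chat.completions.create(
        model="gpt-4o-mini",
        messages=[
            {"role": "system", "content": "Answer using only the provided context."},
            {"role": "user", "content": f"Context:\n{context}\n\nQuestion: {question}"},
        ],
    )
    return response.choices[0].message.content

print(answer("How many vacation days can I carry over?"))
```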

If your organization has a massive database of internal data and you wish to reduce the time for your staff to seek and gather relevant information, semantic search is the key.

AI Chatbot

So, what’s wrong with the current chatbots? Well, they aren’t intelligent. These are programmable chatbots: they know and do exactly what they are told. This is reminiscent of the IBM Deep Blue supercomputer that competed against Garry Kasparov, the reigning world chess champion, in 1996 and 1997. Deep Blue relied on chess moves and strategies programmed into it by IBM engineers and chess experts; it could not think for itself.

A true AI chatbot would have the ability to think for itself and for the company it serves. It needs to be trained on a company’s information so it can respond to customers’ inquiries, based on what it knows, in the form of a dialogue. In effect, it’s like having a private ChatGPT for your organization.

To build an AI chatbot, we require an LLM (large language model) – a type of AI system that can understand, generate, and process human language. The world’s most famous LLM is GPT, the model that powers ChatGPT. We can also use other LLMs such as LLaMA, DeepSeek, Mistral, Falcon, PaLM, BERT and more, many of which are openly available.
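As a rough sketch of what such a private chatbot could look like, the snippet below assumes a recent Hugging Face transformers release and the Mistral-7B-Instruct model; the company name and facts are placeholders, and in a real deployment the knowledge would come from fine-tuning or retrieval rather than a hard-coded string.

```python
# Minimal sketch of a private company chatbot built on an open LLM.
# Assumes a recent transformers release; the model name and company
# facts are placeholder assumptions.
from transformers import pipeline

chatbot = pipeline("text-generation", model="mistralai/Mistral-7B-Instruct-v0.2")

# Company knowledge the bot is allowed to answer from.
company_facts = (
    "Acme Corp sells industrial sensors. The standard warranty is 24 months. "
    "Support hours are 9am to 5pm, Monday to Friday."
)

messages = [
    {
        "role": "user",
        "content": (
            "You are Acme Corp's support assistant. Use only these facts:\n"
            f"{company_facts}\n\n"
            "Customer question: How long is the warranty?"
        ),
    }
]

# The pipeline applies the model's chat template and generates a reply.
result = chatbot(messages, max_new_tokens=150)
print(result[0]["generated_text"][-1]["content"])
```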

AI-Powered Legal Research

Ever wonder why lawyers’ offices are usually stacked with big bookshelves filled with texts? It’s mainly for legal research, where they need to reference a term, a legal case, a statute, or a court decision of some kind.

All this manual work usually goes to paralegals, clerks, trainees, or junior associates. It would be much more efficient to use an AI instead.

I remember watching a documentary over 15 years ago in which computer-based legal research software went up against a paralegal in a search for legal information. The computer, which was probably not a full-blown AI by today’s standards, found the answer in mere seconds, while the person took more than a minute. And that was only the tip of the AI iceberg.

Virtual Assistant

One day, we may not need secretaries anymore. It is a bitter pill to swallow, because today we can already use AI to perform rudimentary tasks such as researching products online, filling in online forms, creating spreadsheets and more. OpenAI already offers such a virtual assistant service, called Operator.

Operator is most likely built to perform general online tasks. But if your enterprise requires something more niche, such as working with your internal workflow management system or ordering factory supplies before they run out, you may need to build one yourself.
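One common way to build such a custom assistant is to let an LLM call your internal functions, often referred to as tool calling. The sketch below assumes the OpenAI Python client’s function-calling interface; check_stock and place_order are hypothetical stand-ins for your own inventory and procurement APIs.

```python
# Minimal tool-calling sketch: an assistant that reorders factory supplies.
# Assumes the openai package and an OPENAI_API_KEY in the environment;
# check_stock and place_order are hypothetical internal hooks.
import json
from openai import OpenAI

def check_stock(item: str) -> int:
    # Hypothetical: query your internal inventory system.
    return {"hydraulic oil": 3, "steel bolts": 1200}.get(item, 0)

def place_order(item: str, quantity: int) -> str:
    # Hypothetical: call your procurement system's API.
    return f"Ordered {quantity} units of {item}."

tools = [
    {
        "type": "function",
        "function": {
            "name": "check_stock",
            "description": "Get the current stock level of a supply item.",
            "parameters": {
                "type": "object",
                "properties": {"item": {"type": "string"}},
                "required": ["item"],
            },
        },
    },
    {
        "type": "function",
        "function": {
            "name": "place_order",
            "description": "Order more units of a supply item.",
            "parameters": {
                "type": "object",
                "properties": {
                    "item": {"type": "string"},
                    "quantity": {"type": "integer"},
                },
                "required": ["item", "quantity"],
            },
        },
    },
]

client = OpenAI()
response = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "We are low on hydraulic oil; restock it."}],
    tools=tools,
)

# Execute whichever internal function the model decided to call.
for call in response.choices[0].message.tool_calls or []:
    args = json.loads(call.function.arguments)
    if call.function.name == "check_stock":
        print(check_stock(**args))
    elif call.function.name == "place_order":
        print(place_order(**args))
```

In a real deployment you would feed the tool results back to the model so it can confirm the order or ask follow-up questions.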
