Conversations for business applications

After the Web and mobile applications, conversations open up a new way of accessing enterprise applications.

Making new out of old

ChatGPT showed how natural language can be used to carry out complex searches and obtain answers that are easy to use.
Natural language is human language, the kind we use in everyday life.

At the end of 2022, OpenAI made ChatGPT available to the public, revealing the capabilities of generative AI to a wide audience.

The novelty does not lie in the search method itself.
With search engines, users were already typing keywords or short phrases.
But search engines return a multitude of links: links that must be explored, browsed, synthesized, and sometimes simply ignored.

ChatGPT and other LLMs (large language models), by contrast, write complete (if not always accurate) answers.
They can even perform certain standardized tasks, such as writing a publication in a journalistic style or generating code.

These AIs generate a written, organized proposal, and the user can then iterate:
“The text is too long”,
“Not direct enough”,
“Rewrite it like an expert”, etc.

Magical.

As early as 1950, Alan Turing explored the concept of talking (and even intelligent) machines, how to assess their performance, and the famous test that bears his name. The first chatbots (ELIZA and a few others) date back to 1966-1970, and the first commercial applications (notably Alice) appeared at the turn of the 2000s.

However, chatbots have suffered from a poor reputation and several shortcomings: lack of contextual understanding, inflexible interactions, limited learning, and so on.

Their mediocre performance, coupled with a frustrating experience, ultimately made them of little use.
We have every reason to believe that recent advances will reshape this experience.

When users meet applications

But first, let's take a quick look at the major stages in the relationship between users and enterprise applications.

Evolution of access to enterprise applications: after the Web and mobile applications, conversations

Encounter of the first kind: the Web

Technological developments in the 1990s (fast Internet, the Web, JavaScript, XML, etc.) led to the advent of the Cloud in the early 2000s, followed by the gradual spread of Web interfaces for business applications.

It was no longer necessary to install a specific program (known as a “thick client”) to access each of these applications.

Encounter of the second kind: mobile applications

Ten years later, in 2007, the arrival of Apple’s first iPhone and its “store” of applications offered rich, evolving mobile access. The experience was revolutionary and rewarding for users.

Driven by a growing need for simplicity, speed, and mobility, the success of mobile applications has been meteoric.

While this was initially the case in the private sphere, it quickly spread to the business world too.

Mobile applications had to meet real business needs: for example, quickly processing a simple, repetitive, or low-value action, or extending the user experience on the move.

Encounter of the third kind: conversational access

In this new stage of bringing applications closer to their users, the consumer market is once again leading the way.

With over 100 million units sold every year, connected speakers (or voice assistants) seem to have reached cruising speed, in a market dominated by Amazon with almost two-thirds of sales.

And consumers’ experience appears to be quite satisfactory.

Adoption of voice assistants

A 2019 study by Capgemini revealed that “Despite 76% of companies having realized quantifiable benefits from voice or chat assistants, businesses need to focus on better meeting customer needs to realize the true potential.”

Conversational interfaces have yet to penetrate the professional sector. Doing so means meeting new challenges:

  • Technical: security, confidentiality, and integration with the existing IT environment
  • Functional: variety of applications and versatility of use
  • Economic: the cost per user must be negligible

Agora meets the challenges of conversational access to business applications

A sovereign, trusted solution

Data security and confidentiality are at the heart of our technical architecture. We aim to offer a sovereign, trusted solution that complies with regulations such as the GDPR.

Agora’s AI specializes in recognizing the intents specific to each project with high accuracy, while requiring few computing resources.

This makes it possible to dedicate a specific NLP (natural language processing) function to each project, offering a unique level of control:

  • The NLP function is dedicated to each project. This eliminates any risk of “contamination” of data or intents between projects or customers.
  • Administrators define the training data themselves, and the neural networks making up the project’s NLP are automatically configured from it. They can therefore ensure that the bot operates correctly within its functional perimeter, add special cases, prohibit certain interactions, and so on, as illustrated below.
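
As a purely hypothetical illustration (the structure, field names, and example utterances below are our assumptions, not Agora’s actual format), project-scoped training data could look something like this:

```python
# Hypothetical illustration of project-scoped training data (not Agora's actual format).
# Each project carries its own intents, example utterances, and prohibited interactions,
# so nothing is shared between projects or customers.
leave_management_project = {
    "project": "acme-hr-leave",
    "intents": {
        "request_leave": [
            "I would like to take next Friday off",
            "Book two days of vacation starting July 3rd",
        ],
        "check_balance": [
            "How many vacation days do I have left?",
            "Show my remaining leave balance",
        ],
    },
    # Interactions the administrator explicitly prohibits for this project.
    "forbidden_utterances": [
        "Show me my colleague's salary",
    ],
}
```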

At the customer project level:

  • Dedicated IT resources (Docker containers, AI/NLP engine, and database) for each project,
  • Anonymization of sensitive data communicated by users via the conversational interfaces (see the sketch after this list),
  • An SSO function to ensure compliance with each customer’s authentication policy.
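
For illustration, here is a minimal sketch of the kind of anonymization step mentioned above, assuming the sensitive data to mask are email addresses and French phone numbers; the patterns and function are ours, not Agora’s implementation:

```python
import re

# Minimal sketch of pre-storage anonymization (illustrative patterns, not Agora's code).
EMAIL_RE = re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+")
PHONE_RE = re.compile(r"\b(?:\+33|0)[1-9](?:[ .-]?\d{2}){4}\b")  # French phone numbers

def anonymize(message: str) -> str:
    """Replace personally identifiable tokens before the message is logged or stored."""
    message = EMAIL_RE.sub("<email>", message)
    message = PHONE_RE.sub("<phone>", message)
    return message

print(anonymize("Call me on 06 12 34 56 78 or write to jane.doe@example.com"))
# -> "Call me on <phone> or write to <email>"
```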

The Agora platform itself offers a high level of security: the solution was designed by experts in the field following a “security by design” approach.

The Agora solution is deployed on three hosting infrastructures, all located in France and geographically distant from one another.

Orchestration mechanisms enable load balancing and the automatic handling of major failures that may occur at one of the hosting providers.
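
As a rough illustration of the automatic handling of failures (the hostnames and health-check endpoint below are invented; this is not Agora’s orchestration code), a client could fall back from one hosting site to the next like this:

```python
import urllib.request

# Illustrative sketch of health-check based failover across geographically separate hosts.
# The hostnames and the /health endpoint are made up for the example.
HOSTS = [
    "https://fr-east.example.net",
    "https://fr-west.example.net",
    "https://fr-south.example.net",
]

def first_healthy_host(timeout: float = 2.0) -> str:
    """Return the first host whose health check answers, falling back to the next one."""
    for host in HOSTS:
        try:
            with urllib.request.urlopen(f"{host}/health", timeout=timeout) as response:
                if response.status == 200:
                    return host
        except OSError:
            continue  # host unreachable: try the next one
    raise RuntimeError("No healthy host available")
```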

Customization capability

The universality of human language opens the door to conversational projects in all fields of activity: HRIS, resource management, ERP, Finance, Logistics, Industry, Communities, Homecare, etc.

A high-quality conversational experience requires considering the company’s operational specifics, such as the actions offered to users, proper nouns, product names, project names, and any specialized vocabulary.

Agora’s two studios (no-code and low-code) make it easy to define interactions and configure conversational processes.

Defining an intent in the Natural Language studio consists of:

  • Integrating specialized vocabulary where necessary.
  • Building the data set that trains the Agora artificial intelligence: all you need to do is enter a few sample sentences, and most of the training set is generated automatically, thanks to the direct integration of an LLM into the Agora studio.
  • Testing and deploying the intent in your project: automatic training from the Agora console produces a dedicated language model in just a few seconds (a sketch of this workflow follows).
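
Here is a hypothetical sketch of that workflow, with scikit-learn standing in for Agora’s NLP and a placeholder expand_with_llm function standing in for the LLM-based augmentation; none of this is Agora’s API:

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Hypothetical sketch of the intent-definition workflow (scikit-learn stands in for Agora's NLP).
seed_examples = {
    "request_leave": ["I want to take Friday off", "Book three days of vacation in August"],
    "check_balance": ["How many days off do I have left?", "What is my leave balance?"],
}

def expand_with_llm(sentence: str) -> list[str]:
    """Placeholder for LLM-based augmentation: in the studio, paraphrases are generated automatically."""
    return [sentence, sentence.lower(), sentence.replace("I ", "I really ")]

texts, labels = [], []
for intent, examples in seed_examples.items():
    for example in examples:
        for paraphrase in expand_with_llm(example):
            texts.append(paraphrase)
            labels.append(intent)

# Train a small model dedicated to this project only.
model = make_pipeline(TfidfVectorizer(), LogisticRegression(max_iter=1000))
model.fit(texts, labels)
print(model.predict(["Could I take tomorrow afternoon off?"]))  # typically ['request_leave'] on this toy data
```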

A real lever for efficiency and profitability

Agora adapts to most technical architectures and applications used by companies and publishers, for deployment without disruption or side-effects.

The main prerequisite is, of course, that the applications to be made conversational are accessible from the Internet via documented APIs.
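
For example (with an entirely made-up endpoint, payload, and token), once an intent and its parameters have been extracted from the conversation, the conversational layer simply calls the documented API:

```python
import json
import urllib.request

# Hypothetical example: once the "request_leave" intent and its parameters have been
# extracted from the conversation, the bot calls the company's documented API.
# The endpoint, payload, and token are invented for the illustration.
API_URL = "https://hr.example.com/api/v1/leave-requests"

def create_leave_request(employee_id: str, start: str, end: str, token: str) -> dict:
    payload = json.dumps({"employee": employee_id, "start": start, "end": end}).encode()
    request = urllib.request.Request(
        API_URL,
        data=payload,
        headers={"Content-Type": "application/json", "Authorization": f"Bearer {token}"},
        method="POST",
    )
    with urllib.request.urlopen(request) as response:
        return json.load(response)
```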

Moreover, conversational interfaces are designed to adapt to users, and therefore to existing solutions:

  • Professional collaboration applications such as MS Teams, Slack, and Google Workspace,
  • Bots integrated into web interfaces or mobile applications,
  • Consumer messaging applications such as WhatsApp, Messenger, or SMS.

The cost of a project with Agora Software breaks down into an initial set-up fee and an annual subscription fee.

Implementation costs are calculated based on the number of working days required.

Finally, the subscription fee is fixed per user or customer, and of course, we’d be delighted to tell you more.

So please don’t hesitate to give us a brief description of your requirements at: contact@agora.software

Want to understand how our conversational AI platform optimizes your users’ productivity and engagement by effectively complementing your enterprise applications?