How we use AI
AI and Large Language Models do not just speed up integration development - they revolutionize it.
Building integrations for your product is a time-consuming and tedious process. For each application you integrate with, you need to:
- Research its capabilities and data model.
- Get access to its API.
- Build the integration logic.
- Build a UI to let users enable and configure the integration.
Depending on your use case, each integration takes from days to months of engineering time.
At the same time, integrations are so important for the business that the biggest SaaS companies have built hundreds or thousands of them over the years. Integrations have become an important part of their competitive moat. It is hard and expensive for a new SaaS company to catch up. Or it used to be, until AI came along.
AI is great at identifying and applying patterns, and building hundreds of integrations is little more than identifying and applying patterns hundreds of times. With AI, most of this work can be automated, democratizing the integration-building process and allowing every SaaS to have hundreds or thousands of integrations without spending years and millions of dollars on R&D.
Here is how it works.
Step 1: AI reads API documentation and creates an API specification
Most applications you want to integrate with have API documentation. But it is written for humans, so it cannot be consumed programmatically. This is where LLMs come in.
When used correctly, they can read API documentation and convert it into a machine-readable API specification. Take the most common integration, Slack: each of its API methods can be converted into a specification entry.
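As an illustration, here is a sketch of what such a generated specification fragment might look like for Slack's chat.postMessage method, expressed as a Python dictionary in OpenAPI style. The exact shape and field selection are assumptions for this example, not the actual output of any specific tool.

```python
# Hedged sketch: an OpenAPI-style fragment an LLM might produce for
# Slack's chat.postMessage method. The structure is illustrative only.
slack_post_message_spec = {
    "paths": {
        "/chat.postMessage": {
            "post": {
                "summary": "Sends a message to a channel.",
                "requestBody": {
                    "content": {
                        "application/json": {
                            "schema": {
                                "type": "object",
                                "required": ["channel", "text"],
                                "properties": {
                                    # Target channel ID or name
                                    "channel": {"type": "string"},
                                    # Message text to post
                                    "text": {"type": "string"},
                                },
                            }
                        }
                    }
                },
            }
        }
    }
}
```

Once the documentation is in a structured form like this, machines can reason about endpoints, parameters, and required fields without a human in the loop.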
Once you have an API specification, you can start automating integration building, but there is one more step.
Step 2: AI creates a connector based on API Specification
An API specification describes what a given application can do and how to make requests to it, but that alone is not enough to build integrations automatically, for two reasons:
- There are different types of APIs. Slack's, for example, is a REST API described by an OpenAPI specification; others are GraphQL APIs, XML-RPC APIs, and so on. You need a way to work across all of them.
- API specifications do not convey meaning. You can see that there is a “Create an issue” API method, but you don’t know that it is the same as “Create a task” in another API, or that the “title” field is where the task name goes. You need a layer of meaning on top of the API specification.
Connectors solve both of these problems.
They normalize different APIs to the same set of concepts.
Additionally, they provide a layer of universalization on top of those entities by mapping common cross-application concepts, such as Universal Data Models, to each application’s API.
How does AI come in here? AI reads API specifications and maps them to these universal concepts automatically (with some help from humans when needed).
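To make this concrete, here is a minimal sketch of what a connector could look like: a mapping from universal operations and field names to one application's native API. All class, field, and endpoint names here are illustrative assumptions, not the platform's actual data model (though the Jira endpoint and `summary` field mirror Jira's real create-issue API).

```python
from dataclasses import dataclass


@dataclass
class Connector:
    """Sketch of a connector: translates universal concepts into one app's API."""

    app: str
    operations: dict  # universal operation -> app-specific API method
    fields: dict      # universal field name -> app-specific field name

    def build_request(self, operation: str, data: dict) -> dict:
        """Translate a universal operation and payload into an app-specific request."""
        return {
            "method": self.operations[operation],
            "payload": {self.fields[key]: value for key, value in data.items()},
        }


# Example connector for Jira: the universal "task.create" operation and
# "name" field map to Jira's create-issue endpoint and "summary" field.
jira = Connector(
    app="Jira",
    operations={"task.create": "POST /rest/api/3/issue"},
    fields={"name": "summary", "details": "description"},
)

request = jira.build_request("task.create", {"name": "Ship v2"})
# request["payload"] == {"summary": "Ship v2"}
```

The key design point is that the integration logic only ever speaks the universal vocabulary ("task.create", "name"); the connector owns all app-specific knowledge, which is exactly the part AI can fill in from the API specification.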
Now we can automate the process of building your integrations.
Step 3: AI maps your integration scenario to connector
Now, when your app wants to do something with any number of external apps, building the integration can be automated.
Let’s say your app collects meeting notes and wants to send them to an external application your customer chooses. The connectors created in the previous step make it possible to automatically map “Note” to a corresponding data object in each application, and the “Create” operation to a corresponding API call.
When a connector does not have a pre-mapped “Note” object type and “Create” operation, AI can still infer what “Push a Meeting Note” means for a given app by looking at its API specification. In those cases, of course, a developer needs to confirm that the AI did not do anything stupid.
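The meeting-notes scenario above can be sketched as a single routing function. Everything here (the function name, the mapping shape, the Notion endpoint and field names) is an illustrative assumption; the point is that one universal action fans out to whichever app the customer picked, and unmapped apps fall back to the AI-plus-review path.

```python
def push_note(connector_mappings: dict, app: str, note: dict) -> dict:
    """Route the universal 'create a Note' action to the customer's chosen app."""
    mapping = connector_mappings.get(app)
    if mapping is None:
        # No pre-mapped "Note" concept for this app: in the flow described
        # above, AI would propose a mapping from the API specification and
        # a developer would review it before it is used.
        raise LookupError(f"No reviewed Note mapping for {app}")
    return {
        "endpoint": mapping["endpoint"],
        "payload": {mapping["fields"][key]: value for key, value in note.items()},
    }


# Illustrative pre-reviewed mapping for one app the customer might choose.
mappings = {
    "Notion": {
        "endpoint": "POST /v1/pages",
        "fields": {"title": "Name", "body": "children"},
    },
}

req = push_note(mappings, "Notion", {"title": "Weekly sync"})
# req["payload"] == {"Name": "Weekly sync"}
```

Adding support for another target app means adding one mapping entry, not writing a new integration, which is what makes the per-integration cost collapse.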
Join the Integration Revolution
While AI will revolutionize product integrations, you still need a platform to leverage its capabilities. SaaS apps that use AI will end up integrated with every other app their customers use. SaaS apps that don’t will end up isolated, bleeding customers.
You can start building your integrations with AI today by signing up for our platform.