The success of automation in the enterprise depends on business users, as well as developers, being able to build workflows. This was front of mind when we built the A-Ops platform, which emphasises an intuitive, drag-and-drop way to build automation workflows.
We’re always looking for ways to make A-Ops simpler to use. That includes evaluating how automation workflows are built in the platform, which, to date, has required familiarity with integration objects and their underlying logic.
Like many others, we’ve been experimenting with Large Language Models (LLMs) in recent months. With intensive prompt engineering, we’ve found that OpenAI’s GPT-4 model can recommend the steps, or ‘objects’, that make up an effective automation workflow from a natural language prompt describing a desired output.
Naturally, we wanted to put this into the hands of our community. The potential to save time when building workflows, and to lower the barrier to entry for A-Ops, is massive.
Templates toward success
That’s why we’re pleased to have rolled out a new TemplateAI integration to all A-Ops users, following its successful beta testing.
Here’s how it works. Users add TemplateAI to their environment like any other integration. They enter a prompt into TemplateAI, which GPT-4 takes and returns a YAML template that it estimates will deliver the desired automation. Users hit ‘create’ and the template is converted into the drag-and-drop, building-block-style workflow typical of our platform. The video below compares this AI-driven process with the process our users have followed to build automation workflows to date. As you’ll see, the potential for time savings is massive.
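To make that flow concrete, a returned template might look something like the sketch below. This is purely illustrative: the object names, fields, and overall schema are invented assumptions for the sake of the example, not the actual A-Ops template format.

```yaml
# Hypothetical output for the prompt:
# "When a server's disk usage passes 90%, clean up old logs,
#  raise a ticket, and notify the ops team."
# All keys and object names below are invented for illustration.
name: disk-alert-remediation
trigger:
  type: monitoring_alert        # assumed trigger object
  condition: disk_usage > 90
steps:
  - object: run_cleanup         # assumed remediation object
    params:
      target: /var/log
  - object: create_ticket       # assumed ticketing object
    params:
      system: servicedesk
      priority: high
  - object: notify              # assumed notification object
    params:
      channel: ops-team
```

Each top-level step would render as one drag-and-drop building block, so a user can reorder, edit, or delete steps before applying anything to a live environment.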
A workflow that TemplateAI produces obviously shouldn’t be lifted and applied to a live environment with no questions asked; troubleshooting and debugging remain pivotal. But it gives users a starting template for their desired automation that’s accurate and reliable most of the time, and that doesn’t touch their actual data.
While there are already online services that use generative AI to suggest automation workflow designs from a prompt, TemplateAI is unique in that the workflow building blocks it renders are always made up of YAML that’s easy to view and understand. Alternative services mask complex Python scripts behind the appearance of simple building-block steps, which limits users who aren’t well-versed in Python.
Data privacy considerations
TemplateAI comes with terms of use: any sensitive data a user wouldn’t share in the public domain shouldn’t be entered into the textbox. We’re fully transparent that we’ll use data from TemplateAI logs to continually improve the feature for the benefit of our community. We don’t believe this is a cause for concern: just as you would download templates from community libraries, TemplateAI’s output should be treated in the same regard.
See it for yourself
Like any LLM application, the best way to test TemplateAI is to spend time experimenting with it on the A-Ops platform. Our Managed Automation-as-a-Service team is using it to drive efficiencies in their working practices.
We’ll likely add more AI integrations to A-Ops in the coming months. We’ll review the security and privacy implications of each new feature individually as it emerges, and always maintain transparency. Follow our LinkedIn feed for the latest updates on how we’re approaching AI integrations in the platform.
Reach out to the team today to find out what A-Ops and TemplateAI could do for your team.