Copado today announced it has expanded a beta program that gives application development teams building for the Salesforce software-as-a-service (SaaS) platform access to generative artificial intelligence (AI) capabilities enabled by ChatGPT.
David Brooks, SVP of Evangelism for Copado, said the company has used vector databases to expose instances of ChatGPT to everything in the knowledge base it has collected over the last 10 years. That approach enables Copado to augment the large language model (LLM) that underpins ChatGPT without exposing any data that might one day be used to train the next version of the generative AI platform.
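Copado has not published implementation details, but the general pattern Brooks describes, often called retrieval-augmented generation (RAG), can be sketched in a few lines of Python. The model names, toy corpus and helper functions below are illustrative assumptions rather than Copado's actual code:

```python
# A minimal retrieval-augmented generation (RAG) sketch: embed knowledge-base
# documents into an in-memory "vector store", retrieve the closest matches to a
# question, and pass only that context to ChatGPT at request time, so the
# knowledge base never becomes training data. The model names and toy corpus
# are illustrative assumptions, not Copado's actual implementation.
import numpy as np
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

knowledge_base = [
    "Copado user stories bundle Salesforce metadata changes for deployment.",
    "A validation-only deployment compiles metadata without committing it.",
    "Back-promotions keep lower environments in sync with production.",
]

def embed(text: str) -> np.ndarray:
    resp = client.embeddings.create(model="text-embedding-3-small", input=text)
    return np.array(resp.data[0].embedding)

def cosine(a: np.ndarray, b: np.ndarray) -> float:
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# Index the corpus once; a production system would use a real vector database.
index = [(doc, embed(doc)) for doc in knowledge_base]

def answer(question: str) -> str:
    q = embed(question)
    # Rank documents by similarity and keep the two best as context.
    ranked = sorted(index, key=lambda item: cosine(q, item[1]), reverse=True)
    context = "\n".join(doc for doc, _ in ranked[:2])
    resp = client.chat.completions.create(
        model="gpt-3.5-turbo",
        messages=[
            {"role": "system", "content": "Answer using only the provided context."},
            {"role": "user", "content": f"Context:\n{context}\n\nQuestion: {question}"},
        ],
    )
    return resp.choices[0].message.content

print(answer("What does a validation-only deployment do?"))
```

Because the knowledge base is only supplied as context at inference time, the vendor's data never needs to be submitted for model training, which is the point Brooks emphasizes.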
The goal is to enable DevOps teams to generate code and tests faster while also reducing the time required to handle support calls, said Brooks.
In addition, Copado is using ChatGPT to enable impact analysis and conflict detection that reduce the friction encountered when deploying applications in a production environment. Those capabilities are critical because, in many instances today, DevOps teams don’t have a firm understanding of how code needs to be deployed, resulting in unnecessary rollbacks of application deployments, noted Brooks.
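Copado has not detailed how its conflict detection works, but the underlying idea can be illustrated with a minimal sketch that flags Salesforce metadata components changed on both a feature branch and its target release branch, so the overlap can be reviewed (or summarized by an LLM) before deployment. The branch names are hypothetical:

```python
# Hypothetical conflict-detection sketch: flag Salesforce metadata components
# changed on both a feature branch and the target release branch since they
# diverged, so the overlap can be reviewed before deployment. This illustrates
# the general idea only; it is not Copado's implementation.
import subprocess

def changed_components(base: str, branch: str) -> set[str]:
    """Return the paths changed on `branch` since it diverged from `base`."""
    out = subprocess.run(
        ["git", "diff", "--name-only", f"{base}...{branch}"],
        capture_output=True, text=True, check=True,
    )
    return {line for line in out.stdout.splitlines() if line}

# Hypothetical branch names for illustration.
feature = changed_components("main", "feature/quote-discounts")
release = changed_components("main", "release/spring")

conflicts = feature & release
if conflicts:
    print("Components changed in both branches; review before deploying:")
    for path in sorted(conflicts):
        print(f"  {path}")
else:
    print("No overlapping components detected.")
```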
Generative AI will also automate the creation of documentation, such as release notes for every application deployed, to reduce the overall level of toil DevOps teams experience, noted Brooks.
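As a rough illustration of that kind of automation, the sketch below collects the commit messages going into a release and asks ChatGPT to turn them into release notes. The model name, prompt and tag names are assumptions, not Copado's implementation:

```python
# A minimal sketch of LLM-generated release notes: gather the commit messages
# between two tags and ask ChatGPT to summarize them. The model name, prompt
# and tag names are illustrative assumptions.
import subprocess
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

def release_notes(prev_tag: str, new_tag: str) -> str:
    log = subprocess.run(
        ["git", "log", "--oneline", f"{prev_tag}..{new_tag}"],
        capture_output=True, text=True, check=True,
    ).stdout
    resp = client.chat.completions.create(
        model="gpt-3.5-turbo",
        messages=[
            {"role": "system",
             "content": "Write concise release notes grouped into Features, Fixes and Known Issues."},
            {"role": "user", "content": f"Commits in this release:\n{log}"},
        ],
    )
    return resp.choices[0].message.content

print(release_notes("v1.4.0", "v1.5.0"))
```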
Copado also plans to provide governance tools that will enable DevOps teams to ensure the code written by an LLM is not based on a hallucination, which can occur when the model tries to reconcile conflicting data, added Brooks. There will, of course, always be a need for a human to review code before it is deployed in a production environment, he added.
In the longer term, Copado will make it possible for organizations to invoke multiple LLMs to address various functions as they see fit, said Brooks. In fact, DevOps teams should expect to soon be employing multiple copilots trained to address specific tasks such as writing code, deploying applications, assessing cybersecurity and addressing compliance mandates, he added.
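One way to picture that multi-copilot future is a simple router that dispatches each DevOps task to a model and system prompt configured for it. The task names, prompts and model choices below are purely illustrative:

```python
# A sketch of the multi-copilot idea: route each DevOps task to a model and
# system prompt chosen for that task. Task names, prompts and model choices
# are assumptions for illustration only.
from openai import OpenAI

client = OpenAI()

COPILOTS = {
    "write_code": {
        "model": "gpt-4",
        "system": "You are a Salesforce Apex coding assistant.",
    },
    "review_security": {
        "model": "gpt-4",
        "system": "You review proposed code changes for security issues.",
    },
    "check_compliance": {
        "model": "gpt-3.5-turbo",
        "system": "You check release artifacts against compliance checklists.",
    },
}

def run_copilot(task: str, request: str) -> str:
    cfg = COPILOTS[task]
    resp = client.chat.completions.create(
        model=cfg["model"],
        messages=[
            {"role": "system", "content": cfg["system"]},
            {"role": "user", "content": request},
        ],
    )
    return resp.choices[0].message.content

print(run_copilot("review_security", "Review this Apex trigger for SOQL injection risks: ..."))
```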
Ultimately, application development and deployment efforts that today require weeks to complete will soon be accomplished within the same day, said Brooks.
At the same time, generative AI will make it simpler for DevOps teams to effectively manage application environments that today operate at levels of scale few IT teams can manually track, said Brooks.
Each organization will naturally need to determine to what degree it is prepared to absorb that level of innovation, but it’s clear the pace at which applications are built and deployed is about to increase exponentially. The issue organizations will increasingly encounter is that business processes will not be able to absorb the rate at which software can now be built and updated.
It’s not clear just yet to what extent generative AI will transform DevOps, but as advances continue to be made, many of the manual tasks and bottlenecks that today conspire to slow down application development and deployment are about to be eliminated in ways that few would have thought possible a little more than a year ago.