· IaC Self-Service
· High-Level Deployment Diagram
· Overview of Pipelines
· Infrastructure Pipeline
∘ terraform-aws-modules
∘ Implementation Prerequisites
∘ Step 1: Create GitHub environments
∘ Step 2: Add infrastructure pipeline code
∘ Step 3: Add GitHub Actions workflow for infrastructure pipeline
∘ Step 4: Kick off infrastructure pipeline
· Application Pipeline (CI/CD)
∘ Step 1: Containerize the app if it’s not yet containerized
∘ Step 2: Add GitHub Actions workflow for CI/CD
∘ Step 3: Kick off CI/CD pipeline
∘ Step 4: Launch our RAGs app
· Destroy and Cleanup
· Key End-to-end Implementation Points
· Summary
LLM applications, when developed to use third-party hosted LLMs such as OpenAI, don’t require MLOps overhead. Such containerized LLM-powered apps or microservices can be deployed with standard DevOps practices. In this article, let’s explore how to deploy our LLM app to a cloud provider such as AWS, fully automated with infrastructure and application pipelines. LlamaIndex offers RAGs, a ready-made RAG chatbot for the community. Let’s use RAGs as the sample app to deploy.
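Since the deployment approach assumes the app is containerized, a minimal Dockerfile sketch may help illustrate what "containerize the app" looks like for a Python app such as RAGs. This is an assumption-laden sketch, not the article's actual Dockerfile: the entry point `app.py`, the use of Streamlit, and the port are all hypothetical and should be adapted to the real repository layout.

```dockerfile
# Hypothetical Dockerfile for a Streamlit-based RAG app.
# File names, entry point, and port are assumptions -- adjust to your repo.
FROM python:3.11-slim

WORKDIR /app

# Install dependencies first so Docker can cache this layer
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

# Copy the rest of the application source
COPY . .

# Streamlit's default port
EXPOSE 8501

# Replace app.py with the app's real entry script
CMD ["streamlit", "run", "app.py", "--server.port=8501", "--server.address=0.0.0.0"]
```

Once an image like this builds locally (`docker build -t rags-app .`), the CI/CD pipeline in the steps above can build and push it to a container registry as part of the automated workflow.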
IaC, short for Infrastructure as Code, automates infrastructure provisioning, ensuring that configurations are consistent and repeatable. There are many tools for implementing IaC; we will focus on HashiCorp’s Terraform in this article.
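To make the Terraform workflow concrete, here is a minimal, illustrative sketch that provisions a VPC using the community terraform-aws-modules collection referenced in the outline above. It is not the article's full pipeline code; the region, resource names, and CIDR ranges are placeholder assumptions.

```hcl
# Minimal Terraform sketch (illustrative only).
# Region, names, and CIDRs below are assumptions, not the article's values.
terraform {
  required_providers {
    aws = {
      source  = "hashicorp/aws"
      version = "~> 5.0"
    }
  }
}

provider "aws" {
  region = "us-east-1" # assumption; choose your own region
}

# Community VPC module from the terraform-aws-modules collection
module "vpc" {
  source  = "terraform-aws-modules/vpc/aws"
  version = "~> 5.0"

  name = "rags-vpc" # hypothetical name
  cidr = "10.0.0.0/16"

  azs             = ["us-east-1a", "us-east-1b"]
  private_subnets = ["10.0.1.0/24", "10.0.2.0/24"]
  public_subnets  = ["10.0.101.0/24", "10.0.102.0/24"]
}
```

Running `terraform init` followed by `terraform plan` against a file like this is exactly what the infrastructure pipeline automates in CI, with `terraform apply` gated behind the GitHub environments created in Step 1.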
The primary goal of IaC self-service is to empower developers with more access, control, and ownership over their pipelines in order to boost productivity.
For those interested, I wrote a five-part series on the DevOps self-service model about a year ago that details all aspects of such a model.