Stakpak is now in Continue

Stakpak + Continue: a DevOps Super Power!

MCP: Connecting LLM apps to the external world

The Model Context Protocol (MCP) is a major advancement in how AI systems interact with external tools and data sources, and I personally find it transformative for creating more capable AI assistants. MCP is a standardized framework designed to connect AI apps with various external tools and data sources. Think of it as a “USB-C port for AI applications”: a universal connector that simplifies integration across multiple systems.

Traditional AI integration involves maintaining separate API connectors for each external tool, each requiring unique code, documentation, authentication methods, and error handling. MCP simplifies this complexity by providing a standardized interface.

MCP is a fundamental shift in how AI systems interact with external resources. It enables AI apps (MCP clients) to dynamically discover and engage with available tools without hardcoding each integration, and facilitates bidirectional, real-time communication similar to RPCs. This creates a more flexible, responsive system compared to traditional architectures.

MCP’s architecture consists of three primary components operating in a hierarchical relationship. At its core, MCP follows a client-server architecture where a host application can connect to multiple servers.

  1. MCP Hosts: AI applications like Claude Desktop, Cursor, or AI-powered IDEs that need access to external data or tools and run the MCP client.
  2. MCP Clients: Establish dedicated one-to-one connections with MCP servers.
  3. MCP Servers: Lightweight programs that expose specific capabilities through MCP, giving access to external data sources and tools.
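Under the hood, clients and servers exchange JSON-RPC 2.0 messages: the client sends a `tools/list` request to discover what a server offers, then a `tools/call` request to invoke a tool by name. As a minimal Python sketch of that wire format (the tool name `generate_terraform` and its arguments are made-up placeholders, not a real Stakpak tool):

```python
import json

def make_request(request_id, method, params=None):
    """Build a JSON-RPC 2.0 message, the wire format MCP uses."""
    msg = {"jsonrpc": "2.0", "id": request_id, "method": method}
    if params is not None:
        msg["params"] = params
    return msg

# 1. The client asks the server which tools it offers.
discover = make_request(1, "tools/list")

# 2. After reading the response, the client invokes a tool by name.
call = make_request(2, "tools/call", {
    "name": "generate_terraform",              # hypothetical tool name
    "arguments": {"resource": "aws_s3_bucket"},  # hypothetical arguments
})

print(json.dumps(discover))
print(json.dumps(call))
```

The host never hardcodes the tool list; it learns it from the `tools/list` response, which is what makes the dynamic discovery described above possible.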

Why this setup is powerful

In a traditional direct API integration, parameters are rigid. Imagine you’re building a chatbot that helps users book flights. Initially, your flight API requires two parameters: departure city and destination city. Developers integrating your API know that their requests must always include these two fields.

Later, you decide to improve your service by adding a seat preference parameter (e.g., window or aisle). Now, every application using your API must be updated to include this new field. If they don’t, their requests might fail, return errors, or produce incomplete responses.

MCP eliminates this rigidity by introducing a more dynamic and adaptive communication model. Instead of assuming fixed request structures, an MCP client (like your chatbot) first queries the MCP server (your flight booking system) to understand its available capabilities. The server responds with details about the required parameters—initially just departure and destination.

When seat preference is later added as a parameter, the next time the client queries the server, it automatically receives the updated list of capabilities and adjusts its requests accordingly—without requiring code changes or redeployment.

This flexibility ensures that integrations remain seamless, even as APIs evolve. Instead of breaking functionality or requiring developers to scramble for updates, MCP allows applications to adapt dynamically to new features.
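The flight-booking scenario above can be sketched in a few lines of Python. The helper below builds a tool call's arguments from whatever schema the server currently advertises (the schema shape loosely follows the JSON-Schema-style `properties`/`required` layout MCP tools use; the field names are illustrative):

```python
def build_arguments(input_schema, known_values):
    """Fill a tool call's arguments from the server's advertised schema."""
    args = {}
    # Required fields must be present, or the call is rejected client-side.
    for field in input_schema.get("required", []):
        if field not in known_values:
            raise ValueError(f"missing required field: {field}")
        args[field] = known_values[field]
    # Optional fields are included only when the client has a value for them.
    for field in input_schema.get("properties", {}):
        if field not in args and field in known_values:
            args[field] = known_values[field]
    return args

# Day 1: the server's schema only requires departure and destination.
schema_v1 = {
    "properties": {"departure": {}, "destination": {}},
    "required": ["departure", "destination"],
}
print(build_arguments(schema_v1, {"departure": "CAI", "destination": "BER"}))

# Later: the server adds an optional seat_preference field. The same client
# code picks it up from the refreshed schema without any redeployment.
schema_v2 = {
    "properties": {"departure": {}, "destination": {}, "seat_preference": {}},
    "required": ["departure", "destination"],
}
print(build_arguments(schema_v2, {"departure": "CAI", "destination": "BER",
                                  "seat_preference": "window"}))
```

Because the client derives its request structure from the schema at runtime, the server can grow new fields and old clients keep working unchanged.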

Continue: The Custom AI Coding Assistant Every Dev Needs

Continue is a customizable, open-source AI-powered coding assistant designed to enhance the software development process. It works with popular IDEs like VSCode and JetBrains to amplify the work of experienced developers, and it fits perfectly into the suite of tools driving AI-native development. Continue has everything you’d expect from a coding assistant:

  1. Chat Interface: Engage in real-time conversations with the AI to understand, debug, and iterate on your code directly within the IDE.
  2. Autocomplete: Receive inline code suggestions as you type, facilitating faster coding across various programming languages.
  3. Code Editing: Modify existing code by providing natural language instructions, allowing for efficient refactoring and updates without leaving your current file.
  4. Custom Actions: Set up shortcuts for common tasks, streamlining repetitive processes and enhancing workflow efficiency.
  5. Model Flexibility: Continue supports various AI models (e.g., GPT-4, Claude) and even allows integration with local models for enhanced customization.
  6. Context Providers: Developers can add context providers to enrich the AI’s understanding of their project by incorporating the latest documentation or external resources.

But unlike your average coding assistant, Continue can be tailored to fit different domains and needs, including DevOps, with Lego-like building blocks. It’s super customizable and has a marketplace for sharing your own coding assistants and building blocks.

Why Stakpak + Continue is a DevOps Super Power

Stakpak partnered with Continue to enhance infrastructure-as-code generation (and possibly give DevOps the fun it deserves). Stakpak makes coding assistants better at DevOps: LLMs are impressive at coding, but they suck at DevOps tasks, especially generating valid configurations. Stakpak makes generating and modifying infrastructure configurations as easy as ordering coffee (but without the barista spelling your name wrong). It ensures accuracy without sacrificing usability, supports even the most complex projects, and is trusted by developers worldwide for building production-ready infrastructure quickly. This combination addresses two critical pain points in modern DevOps:

1. Reliable IaC Generation: Stakpak achieves 95% one-shot validity for Terraform configurations (1,900 of 2,000 generations passing syntax and schema validation). This reliability stems from context-aware code synthesis using provider docs and real-time schema validation.
2. Integrated Development Flow: Thanks to the Model Context Protocol (MCP), Continue gets direct access to Stakpak’s specialized DevOps capabilities, allowing developers to generate infrastructure code on the fly with confidence—whether it’s Terraform, Kubernetes, or just a Dockerfile.

Continue in action calling Stakpak

Currently, Stakpak in Continue supports only IaC generation, but we’re working on expanding its capabilities. Soon, through MCP, Continue users will be able to leverage Stakpak’s Terminator-like DevOps agents that never stop until they debug your cloud, or dockerize and deploy your apps. This means less time wrestling with deployment scripts and more time focusing on what matters—like finally fixing that one “temporary” workaround.

How to Set Up and Use Stakpak in Continue

Install the Continue extension on VS Code or JetBrains if you don’t have it already.

  1. Go to Continue Hub (https://hub.continue.dev/new) and create a new assistant.
  2. Scroll down to MCP Servers and click create.
  3. Add the MCP Server to the assistant.
  4. Add the following to your config.yaml (get your free API key from the Stakpak API Key page):


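The entry in config.yaml follows the shape of Continue’s `mcpServers` block. The sketch below is illustrative only: the actual launch command, arguments, and environment variable name come from Stakpak’s setup instructions, so treat every value here as a placeholder.

```yaml
mcpServers:
  - name: Stakpak                # display name shown in Continue
    command: stakpak             # placeholder: use the command from Stakpak's docs
    args:
      - mcp                      # placeholder arguments
    env:
      STAKPAK_API_KEY: <YOUR_STAKPAK_API_KEY>   # your free Stakpak API key
```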
Now you can generate your Infrastructure as Code using Stakpak.

This is just the beginning. As AI continues to reshape software development and DevOps, tools like Stakpak and Continue will lead the charge, making quick, production-ready infrastructure a reality. 🚀

Continue wrote their take about this partnership. Check out their blog post to learn more about custom AI Coding Assistants.
