Encore

Software Development

DevOps Automation for your cloud on AWS & GCP

About us

Encore Cloud is the most productive way to build using AWS & Google Cloud. Encore provisions and manages infrastructure in your cloud environments, giving developers the speed of self-serve development without compromising on security or compliance.

Trusted by companies from day 1 to the Nasdaq, including: Groupon, Echo (acquired by Coinbase), Carla, Bookshop.org, Pave Bank, and 100+ more.

Encore is built for AI-era development:
- Don't stop and wait to set up infra: let your AI tool define infra in code and Encore provisions it automatically (with guardrails, observability, and audit logs).
- Automatic validation catches mistakes in AI-generated code as compilation errors.
- Preserve human knowledge with auto-generated docs, service catalogs, and architecture diagrams.

Real impact: teams report 2-3x faster development, 90% shorter project lead times, and 93% less time spent on DevOps work. Start small with new services, then migrate at your own pace.

Learn more: https://encore.cloud

Website
https://encore.cloud
Industry
Software Development
Company size
11–50 employees
Headquarters
Stockholm
Type
Privately held
Founded
2020

Addresses

Employees at Encore

Updates

  • Encore reposted this

    Marcus Kohlberg

    Encore · 13K followers

    Infrastructure as code was a 2014 idea, and 2026 is the last year of Terraform.

    You write YAML or HCL that describes infra, then a tool reconciles it. It assumed a human was the source of truth. That assumption breaks when an AI agent is writing the code. An agent let loose on Terraform doesn't know which IAM role to reuse, or which firewall rule to remove as it's no longer needed. You end up with infra sprawl, permission drift, and a security review queue that never empties.

    Infrastructure FROM code flips the model. The application code is the source of truth. A compiler reads the code, understands the intent (this is a Pub/Sub topic, this service calls that one, this endpoint needs auth), and derives the infra deterministically. No YAML. No drift. No hallucinated modules.

    We've been building toward this at Encore for years. What we didn't predict in 2021 is how well it would fit the agentic era. Turns out a compiler is a very good guardrail for an AI writing backend code. Fewer decisions for the agent to get wrong. Fewer tokens burned on boilerplate. Reviewable, auditable output by default.

    The AI era doesn't need more config files. It needs fewer.

    More on the blog: https://lnkd.in/ebe_GgmK
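    The "infrastructure from code" flow can be sketched in plain TypeScript. This is a minimal illustration of the idea only, not Encore's actual API: the resource registry, the `pubsubTopic`/`sqlDatabase` declarations, and `derivePlan` are all invented names, standing in for what a real compiler would extract by parsing the source.

```typescript
// Sketch: resource declarations live in application code, and a
// "compiler" pass derives the infrastructure plan from them.

type Resource =
  | { kind: "pubsub-topic"; name: string }
  | { kind: "sql-database"; name: string };

// Global registry that declarations append to (a stand-in for
// static analysis of the codebase).
const registry: Resource[] = [];

function pubsubTopic(name: string): void {
  registry.push({ kind: "pubsub-topic", name });
}

function sqlDatabase(name: string): void {
  registry.push({ kind: "sql-database", name });
}

// Application code: the only place infrastructure is described.
pubsubTopic("user-signups");
sqlDatabase("app");

// Derivation step: a deterministic infra plan. No YAML, no drift.
function derivePlan(resources: Resource[]): string[] {
  return resources.map((r) => `${r.kind}:${r.name}`).sort();
}

console.log(derivePlan(registry));
// ["pubsub-topic:user-signups", "sql-database:app"]
```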

  • Encore reposted this

    One of the more interesting problems we've been working through internally: what does development infrastructure look like when AI agents are doing meaningful portions of the work?

    The current workflow most teams are landing on is messy. Agents run locally, spin up multiple workspaces, execute changes against mocked or partially real environments, and hand off to humans for review. It works, but it doesn't scale. Setup times are long, the environments aren't real (in terms of matching production infrastructure), and by the time a human reviews the output there's no reliable way to validate the changes without deploying to real cloud infrastructure.

    The model we're working toward is different: cloud sandbox environments where agents can iterate against real infrastructure, with full tracing and observability from the start. The output of an agent session isn't just code, it's rich validation based on running a full-stack application on real infrastructure, walking through the changes end to end and verifying the result using both tracing and UI snapshots. Real evidence of correctness, not just a code diff to review.

    The other dimension is scale. If you can spin up thousands of real, fully-traced cloud environments in seconds, the economics of parallel agentic work change completely. Instead of a handful of agent sessions per engineer, your team is running thousands of agents in parallel, each in their own sandboxed environment, with the results collapsing into validated, deployable output.

    This is early and we're still figuring out the right shape of it. But the direction feels right: agents need real infrastructure, real observability, and fast environments. That's what we're building toward at Encore.

  • Encore reposted this

    "AI-native" has become one of those terms that gets attached to everything and means nothing. Here's what it means to us in practice, from something we just shipped. We added a "Copy LLM prompt" button to the Encore local dev dashboard. When you click it, it generates a prompt that will leverage Encore's built-in MCP to capture full system context: the service topology, the live request traces, the API structure, the infrastructure dependencies. Ready to paste directly into any AI coding tool. It sounds simple. But the reason it works is because Encore already knows your entire system at the framework level. It's not scraping your codebase or inferring structure from comments. The full picture of your application; every service, every API endpoint, every infrastructure dependency, is captured declaratively in the source code. That same structured representation is what makes AI agents dramatically more effective when working on Encore apps. Instead of spending tokens reading through Terraform configs, scattered YAML files, and CI pipelines, an agent can get complete system understanding from the structured metadata Encore maintains automatically. In practice this means agents working on Encore codebases use around 75% fewer tokens than on traditional toolchains. AI-native isn't a UI feature. It's a fundamental architectural property. If your system's structure isn't explicitly represented in a way AI can consume directly, you're not AI-native, you're just AI-adjacent.

  • Encore reposted this

    Marcus Kohlberg

    Encore · 13K followers

    The best thing about having engineers as customers is that they help you make your product better.

    Simon Vans-Colina is CTO of Pave Bank, a long-time Encore customer. He and his team are among the most pioneering developers in fintech, and are leveraging AI to build out an entirely new type of bank. His feedback has been the trigger for lots of innovation at Encore, including adding built-in MCP for all apps built with Encore very early on.

    This means when his team is building Pave, their agents can:
    - Autonomously introspect and interact with their local dev environments
    - Get answers about architecture, services, and API surface area
    - Call APIs and view traces
    - And more...

    This approach radically shortens iterations and cuts unnecessary token spend.

    His team is actively hiring, so if you like Go and want to join one of the most forward-leaning teams out there, hit him up.

  • You can now debug backend requests directly from your frontend. Drop a script tag into your app and get:
    - Traces for every API call, linked directly from your frontend
    - Backend logs for each request
    - One-click LLM prompts to debug any request with Encore MCP

    Works locally and in deployed environments.
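    Linking frontend requests to backend traces typically works by tagging every outgoing call with a correlation ID the backend echoes into its logs and traces. A minimal sketch, assuming an invented `x-trace-id` header and ID format (not Encore's actual mechanism; real tracers usually follow the W3C trace-context spec):

```typescript
// Sketch: attach a trace ID to every frontend request so backend logs
// and traces can be linked back to the call that caused them.

interface FetchOptions {
  method?: string;
  headers?: Record<string, string>;
}

function newTraceId(): string {
  // 32 hex chars: a simplified stand-in for a real trace ID.
  return Array.from({ length: 32 }, () =>
    Math.floor(Math.random() * 16).toString(16)
  ).join("");
}

function withTrace(init: FetchOptions = {}): FetchOptions {
  return {
    ...init,
    headers: { ...init.headers, "x-trace-id": newTraceId() },
  };
}

// Usage (hypothetical): fetch("/api/orders", withTrace({ method: "GET" }));
// Backend middleware reads x-trace-id and tags its logs and traces with it.
```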

  • We're hiring the top 0.0001% engineers to help us build the development platform for the AI era. We've grown more in the last few months than in the previous four years combined, with thousands of developers using the framework and billions of requests processed by applications built on Encore. If making Encore the default way teams build and ship in the cloud sounds interesting, link below.

  • We wrote 60,000 lines of Rust to power a TypeScript framework, and are now sharing what that journey actually looked like. The runtime handles everything underneath your application code - HTTP, databases, pub/sub, tracing, caching, API gateway - all running multi-threaded in the same process as Node.js. Give it a read: https://lnkd.in/dd-bwM-J

Funding

Encore · 2 rounds total

Latest funding round

Seed

US$2,698,575.00

See more info on Crunchbase