Just like the cloud, at your edge

Delightfully easy container application management at
the edge

Applications are placed at the edge to be outage-resilient and to provide blazing-fast data processing. Now is the time to seamlessly extend deployment and monitoring capabilities to the edge.
Request Free Trial

Customer testimonial

Flasheye: From standalone product to scalable software-as-a-service


Extenda Retail ushers in a new era of in-store software with Avassa’s edge platform

We’re live

Avassa at Edge Field Day 1

The first-ever Edge Field Day is finally here and Avassa is first up. Jump straight into the live stream to watch a jam-packed demo.

In this demo, we follow a platform engineer and an application developer on their quest to orchestrate and monitor container applications running on on-site edge infrastructure.


Apply a cloud operating model to edge applications

With the ability to use the operational technologies built for a modern cloud environment, you won’t waste time reinventing trusted tools and processes for the edge.

Avassa bridges the gap between modern containerized application development and operations and your distributed infrastructure. By embracing automation across how you develop, test, and release applications, you can add new features and put them into production at the edge quicker than ever.

Why edge?

There is no turning back.
You need to move at the speed of software.

Enterprises expect the feature velocity that the central cloud has set as the standard. While you may have moved significant parts of your applications to one or more public clouds, many of your application components run much better at the edge. This, however, should not limit your ability or speed to innovate. With Avassa, you can increase efficiency and lower the overhead costs of edge application management by extending your central cloud tooling to the edge.


Be up and running faster than you can say edge cloud orchestration

You set up new sites by installing the Avassa Edge Enforcer agent on each of the hosts making up a site. The Edge Enforcer is a container application that provides features related to the management of the host-local container runtime, including call-home, cluster formation and container scheduling.

Installing the Edge Enforcer is a simple operation based on a zero-trust architecture that does not require host-specific configuration. Edge Enforcers call home to their configured Control Towers and receive all necessary configuration, including keying material for site- and tenant-level encryption. The basic initial setup of Edge Enforcers can easily be automated across large numbers of sites using deployment tools like Red Hat Ansible or HashiCorp Terraform.
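As a sketch of what that automation could look like, the following Ansible playbook starts the agent container on every host of a site. The image name, variables, and environment variable are illustrative placeholders, not Avassa’s actual artifact names:

```yaml
# Illustrative Ansible playbook. The image reference, inventory group,
# and CONTROL_TOWER_HOST variable are hypothetical placeholders.
- name: Install Edge Enforcer on all hosts of a site
  hosts: edge_site_hosts
  become: true
  tasks:
    - name: Run the Edge Enforcer agent container
      community.docker.docker_container:
        name: edge-enforcer
        image: "{{ enforcer_image }}"     # e.g. pulled from your registry
        restart_policy: always
        env:
          CONTROL_TOWER_HOST: "{{ control_tower_host }}"  # where to call home
```

Because the agent calls home and fetches its configuration, the playbook stays identical across sites; only the inventory changes.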

You declaratively describe containerized applications using application specifications that define sets of containers that run together as one versioned unit. Application specifications are in YAML or JSON format so that you can manage them as code or through the Control Tower web interface.
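As an illustration, an application specification could look like the following simplified YAML. The schema shown is a sketch for readability, not Avassa’s exact format:

```yaml
# Simplified, illustrative application specification
# (field names are a sketch, not Avassa's exact schema)
name: store-analytics
version: "1.2.0"          # the set of containers is versioned as one unit
services:
  - name: camera-feed
    image: registry.example.com/camera-feed:1.2.0
  - name: analytics
    image: registry.example.com/analytics:1.2.0
```

Because the specification is plain YAML, it can live in version control and flow through the same review and CI pipelines as the application code itself.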

You describe which sites an application will be deployed on using application deployment specifications. Application deployments use a simple matching language to define under which circumstances (e.g. geographical, hardware features, available resources) an application should be started, and how the application should be rolled out across the matched sites.
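A deployment specification along these lines might then pick out the matching sites and describe the rollout. Again, the matching syntax below is an illustrative sketch, not Avassa’s exact language:

```yaml
# Illustrative deployment specification
# (matching and rollout syntax is a sketch, not Avassa's exact language)
application: store-analytics
match-sites:
  region: eu-north        # geographical constraint
  labels:
    gpu: "true"           # hardware feature required by the application
rollout:
  strategy: rolling       # upgrade a handful of sites at a time
  max-parallel: 10
```

Keeping placement rules declarative means that newly onboarded sites matching the criteria can receive the application automatically, with no per-site deployment step.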

After applications are deployed you upgrade them and change which sites they run on by simply changing the application and deployment configuration. The platform figures out the detailed changes needed and performs them with minimal impact.

Your newly started applications need to be monitored for health, and at times you need observability into them to detect and resolve the underlying causes of operational issues.

The runtime status of all containers belonging to a deployment is automatically monitored, both from a container runtime perspective and through optional health checks that run application-specific commands. Operational issues are reported from an application-centric perspective, correlating each issue with the affected container, the application it belongs to, and the site where it runs.

For observability, you use distributed log queries to fetch historical logs, or stream current logs from your distributed locations. You filter based on time, log content, location and many other aspects to only get the logging data relevant to your current operational issue.


Opinionated and user-friendly

We don’t overcomplicate things. With Avassa, it’s easy to deploy containerized applications across large-scale edge clouds in a few clicks.

Distributed security
you can trust

We built our platform on zero-trust with a layered security model across sites and tenants, and eliminated any time-consuming manual work.

Application-centric operations

Our platform was built with the needs of application operations at the forefront, freeing you from the massive complexities associated with infrastructure-centric solutions.

Our solution works well with

Recent resources



Looking for our platform documentation?

Deep dive into the details of how our solution works in our platform documentation.


Don’t just take our word for it


Sign up for our newsletter

We’ll send you occasional emails, and you can opt-out at any time.
By clicking the button you agree to our Privacy Policy.


We are Avassa

Avassa empowers businesses to bridge the gap between modern containerized application development and operations and the distributed edge infrastructure. Based in Stockholm, Sweden, Avassa was founded in 2020 and is a privately held company funded by Fairpoint Capital and Industrifonden.