
Latest News

Control the complexities of containers with the Ocean Suite for Kubernetes

In the relatively short time Kubernetes has been around, it has rapidly matured into a critical technology foundation for the cloud, and even applications once thought unviable for containers now run on it. As companies expand their usage, operationalizing Kubernetes with automation and optimization is critical to maintaining speed, agility, and control over the long term.

GCP Integrations for Metrics with Logz.io

Logz.io has dedicated itself to encouraging and supporting cloud-native development. That has meant doubling down on support for AWS and Azure, but also deepening our integrations with Google Cloud Platform (GCP). Recently, our team added dozens of new metrics integrations covering the gamut of products in the GCP ecosystem.
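The excerpt doesn't show how a metrics integration is wired up, but a rough sketch of pulling recent time series from Google Cloud Monitoring (the source such integrations typically read from) looks like the following. This is an illustration only, using the standard google-cloud-monitoring client; the project ID and metric filter are assumptions, and forwarding the data to Logz.io is not shown and is not the actual integration code.

```python
# Illustrative only: query recent CPU utilization time series from
# Google Cloud Monitoring. Shipping them to Logz.io (or any backend)
# would happen after this step and is not shown here.
import time

from google.cloud import monitoring_v3  # pip install google-cloud-monitoring

PROJECT_ID = "my-gcp-project"  # placeholder: replace with a real project ID

client = monitoring_v3.MetricServiceClient()

now = time.time()
interval = monitoring_v3.TimeInterval(
    {
        "end_time": {"seconds": int(now)},
        "start_time": {"seconds": int(now - 300)},  # last 5 minutes
    }
)

results = client.list_time_series(
    request={
        "name": f"projects/{PROJECT_ID}",
        "filter": 'metric.type = "compute.googleapis.com/instance/cpu/utilization"',
        "interval": interval,
        "view": monitoring_v3.ListTimeSeriesRequest.TimeSeriesView.FULL,
    }
)

for series in results:
    for point in series.points:
        print(series.resource.labels.get("instance_id"), point.value.double_value)
```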

Do you need a business case to migrate to the cloud? The answer is clear!

Summary: The cloud is always innovating. One of the more recent and significant breakthroughs has been the advancement of CPU architectures, specifically ARM processors, which are seeing adoption across all forms of computing: not only the cloud, but also laptops with Apple's M1 and, over the past decade, mobile phones. Their more recent availability in cloud computing is therefore not surprising, given the progress made in every other area of technology.

5 Best Practices for Successful Microservices Implementation

Microservices have significantly altered the architecture of server-side applications. Rather than a single massive monolithic codebase containing all of your application's business logic, a microservices architecture follows the distributed-systems model, in which a collection of application components collaborate to meet business goals. By adhering to industry best practices, you can build a streamlined microservices ecosystem free of unnecessary architectural complexity.
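As a toy illustration of the "collection of collaborating components" idea (not taken from the article), here is a minimal Flask service that owns a single business capability and exposes it only over HTTP; other services would call this API rather than sharing its code or database. The service name, port, and endpoints are hypothetical.

```python
# Hypothetical single-responsibility "inventory" microservice sketch.
# Other services interact with it only through this HTTP API, never
# through a shared database or in-process calls.
from flask import Flask, jsonify  # pip install flask

app = Flask(__name__)

# In a real service this data would live in a datastore owned by this service.
_stock = {"sku-123": 42, "sku-456": 7}

@app.route("/health")
def health():
    # Liveness endpoint so an orchestrator (e.g. Kubernetes) can probe the service.
    return jsonify(status="ok")

@app.route("/inventory/<sku>")
def get_stock(sku):
    if sku not in _stock:
        return jsonify(error="unknown sku"), 404
    return jsonify(sku=sku, quantity=_stock[sku])

if __name__ == "__main__":
    app.run(port=8080)
```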

Announcing support for Windows containers on AWS Fargate

AWS Fargate is a serverless compute engine that allows you to deploy containerized applications with services such as Amazon ECS without needing to manage the underlying virtual machines. Deploying with Fargate removes operational overhead and lowers costs by enabling your infrastructure to scale dynamically to meet demand. We are proud to partner with AWS for its launch of support for Windows containers on AWS Fargate.
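To make the announcement concrete, here is a hedged sketch of registering an ECS task definition that targets Fargate with the Windows runtime platform, using boto3. The family name, image, and sizing are placeholders; check the Fargate documentation for the Windows OS families and CPU/memory combinations that are actually supported.

```python
# Sketch: register an ECS task definition for a Windows container on Fargate.
# The family name, image, and sizing values are placeholders.
import boto3

ecs = boto3.client("ecs", region_name="us-east-1")

ecs.register_task_definition(
    family="windows-web-demo",
    requiresCompatibilities=["FARGATE"],
    networkMode="awsvpc",          # required for Fargate tasks
    cpu="1024",                    # 1 vCPU
    memory="2048",                 # 2 GiB
    runtimePlatform={
        "operatingSystemFamily": "WINDOWS_SERVER_2019_CORE",
        "cpuArchitecture": "X86_64",
    },
    containerDefinitions=[
        {
            "name": "iis",
            "image": "mcr.microsoft.com/windows/servercore/iis:windowsservercore-ltsc2019",
            "essential": True,
            "portMappings": [{"containerPort": 80, "protocol": "tcp"}],
        }
    ],
)
```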

How to rename an API Connection thru the Azure Portal

Starting to build Logic Apps through the Azure Portal is, without a doubt, the most intuitive and fastest approach. It doesn't require any additional tools or software or a Visual Studio license, almost all beginner tutorials and documentation use it, and users of all kinds are familiar with the Azure Portal. But not everything is perfect, and one of the most difficult best practices to implement is a proper API Connection naming convention.
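The renaming steps themselves are in the article; as a side note on the naming-convention point, a sketch like the one below can surface API Connections that still carry auto-generated names. It uses the Azure SDK for Python; the subscription ID and the "CON-" prefix convention are assumptions for illustration only.

```python
# Sketch: list API Connections (Microsoft.Web/connections) in a subscription
# and flag names that don't follow a chosen convention, e.g. a "CON-" prefix.
# The subscription ID and the convention itself are placeholders.
from azure.identity import DefaultAzureCredential          # pip install azure-identity
from azure.mgmt.resource import ResourceManagementClient   # pip install azure-mgmt-resource

SUBSCRIPTION_ID = "00000000-0000-0000-0000-000000000000"   # placeholder

client = ResourceManagementClient(DefaultAzureCredential(), SUBSCRIPTION_ID)

connections = client.resources.list(
    filter="resourceType eq 'Microsoft.Web/connections'"
)

for conn in connections:
    if not conn.name.startswith("CON-"):
        print(f"Rename candidate: {conn.name} ({conn.id})")
```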

Ocean Headroom Explained - Launch pods without delay!

The dynamic nature of cloud-native applications is both a blessing and a curse. The ability to use compute, storage, and network resources without managing physical hardware is a real blessing: your applications can take advantage of the seemingly limitless resources available in the public cloud. Unfortunately, the curse becomes clear when the bill arrives! Finding the balance between optimal application performance and minimal cost is a significant CloudOps challenge.
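The article explains Ocean's headroom feature itself; as a rough analogy only (not Ocean's implementation), the snippet below shows the generic Kubernetes overprovisioning pattern, where a low-priority placeholder deployment keeps spare capacity warm so real pods can start without waiting for new nodes. The priority class name and pause-pod sizing are assumptions.

```python
# Rough analogy to "headroom": a low-priority placeholder deployment that
# reserves spare capacity. When real pods need room, the scheduler evicts
# these pause pods instead of waiting for a new node to launch. This is the
# generic overprovisioning pattern, not Ocean's actual mechanism.
from kubernetes import client, config  # pip install kubernetes

config.load_kube_config()

apps = client.AppsV1Api()

placeholder = client.V1Deployment(
    metadata=client.V1ObjectMeta(name="headroom-placeholder"),
    spec=client.V1DeploymentSpec(
        replicas=2,
        selector=client.V1LabelSelector(match_labels={"app": "headroom"}),
        template=client.V1PodTemplateSpec(
            metadata=client.V1ObjectMeta(labels={"app": "headroom"}),
            spec=client.V1PodSpec(
                priority_class_name="overprovisioning",  # assumed low-priority class
                containers=[
                    client.V1Container(
                        name="pause",
                        image="registry.k8s.io/pause:3.9",
                        resources=client.V1ResourceRequirements(
                            requests={"cpu": "500m", "memory": "512Mi"}
                        ),
                    )
                ],
            ),
        ),
    ),
)

apps.create_namespaced_deployment(namespace="default", body=placeholder)
```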

Announcing support for Graviton2-powered AWS Fargate deployments

AWS Fargate is a serverless compute engine that allows you to deploy containerized applications on services like Amazon ECS without needing to provision or manage compute resources. Now, Datadog is proud to be a launch partner with Amazon for its support of AWS Fargate workloads running on Graviton2, Amazon's custom-built ARM64 processor.
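For reference, the same boto3 task-definition call shown earlier can target Graviton2 by setting the CPU architecture to ARM64; the sketch below also runs the Datadog Agent as a sidecar container, a common pattern for monitoring Fargate tasks. The family, images, sizing, and API-key handling are placeholders, not a definitive setup.

```python
# Sketch: a Fargate task definition targeting Graviton2 (ARM64), with the
# Datadog Agent as a sidecar container. Family, images, sizing, and the
# API-key handling are placeholders.
import boto3

ecs = boto3.client("ecs", region_name="us-east-1")

ecs.register_task_definition(
    family="arm64-web-demo",
    requiresCompatibilities=["FARGATE"],
    networkMode="awsvpc",
    cpu="512",
    memory="1024",
    runtimePlatform={
        "operatingSystemFamily": "LINUX",
        "cpuArchitecture": "ARM64",       # run the task on Graviton2
    },
    containerDefinitions=[
        {
            "name": "web",
            "image": "public.ecr.aws/nginx/nginx:latest",   # multi-arch image
            "essential": True,
            "portMappings": [{"containerPort": 80, "protocol": "tcp"}],
        },
        {
            "name": "datadog-agent",
            "image": "public.ecr.aws/datadog/agent:latest",  # multi-arch image
            "essential": False,
            "environment": [
                {"name": "ECS_FARGATE", "value": "true"},
                {"name": "DD_API_KEY", "value": "<your-api-key>"},  # placeholder
            ],
        },
    ],
)
```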