The latest News and Information on DevOps, CI/CD, Automation and related technologies.

Cloud threat detection and response

Google Security Command Center (SCC) Enterprise is the industry’s first cloud risk management solution that fuses cloud security and enterprise security operations - supercharged by Mandiant expertise and AI at Google scale. Watch and learn how to detect threats to your cloud resources and automate attack response.

Changelog Breakdown: Focus Tab, GitKraken.dev, & more

Dive into the latest GitKraken Client updates – starting with Focus View, helping you prioritize all PRs, Issues, and WIPs so you waste less time wondering, "What's next?" and more time coding. Worried about security? We've got new customizable protections to ensure that your work (and your mind) stays safe and at ease. Whether you're managing your Workspaces or sharing code with Cloud Patches, GitKraken brings everything you care about into one accessible, secure, and efficient place.

Ubuntu AI | S2E3 | GPU utilisation optimisation at KubeconEU 2024

Maciej is not only the host of our podcast, but also an experienced keynote speaker. After a joint keynote at KubeconEU 2023 about highly sensitive data, in 2024 Maciej heads to Paris to talk about GPU utilisation. During our podcast, we cover many aspects of GPU utilisation: from best practices to existing tooling, Maciej discusses the topic from different angles, giving a sneak peek into his keynote. Curious how open source tooling helps optimise GPU utilisation? Listen to our podcast!

Advice for building an incident management program

On this week's episode of The Debrief, we chatted with Jeff Forde, an Architect on the Platform Engineering team at Collectors. With a background spanning finance, healthcare, and various product-led startups, Forde has honed his expertise in DevOps, site reliability, and platform engineering. Beyond his professional life, he's also a dedicated volunteer first responder and certified fire instructor in Connecticut, giving him a unique perspective on managing incidents of all types.

Azure Cost Management and FinOps: Lessons from the Frontlines

This episode of "FinOps on Azure" dives into the crucial issue of managing Azure costs effectively. It addresses the common challenges organizations face in controlling their Azure spending and offers insights and strategies to prevent unexpected overspending. Through real-world experiences shared by Saravana Kumar, CEO of Kovai.co, viewers can gain valuable lessons on optimizing Azure consumption and establishing robust cost governance practices.

The Future Of Cloud Cost Management: AI And Machine Learning On AWS

As organizations increasingly migrate to the cloud, managing expenses efficiently becomes crucial. Traditional cost management methods often fall short in this environment, where resource allocation and usage can fluctuate dramatically. Enter Artificial Intelligence (AI) and Machine Learning (ML). These cutting-edge technologies are revolutionizing the way businesses approach cloud cost management.

Qovery is Now Available on the AWS Marketplace

I'm thrilled to announce the availability of Qovery on the AWS Marketplace. You can now buy and benefit from Qovery right from the AWS Marketplace. Before delving into the specific advantages of purchasing Qovery through the AWS Marketplace, let's first understand what the AWS Marketplace is and why this is something you should consider when purchasing Qovery.

Automating Azure Cloud Unit Economics Generation: The Turbo360 Advantage

The scalability of the cloud and its inherently variable costs create financial and operational challenges, demanding that the varying costs of a dynamic Azure infrastructure be tracked at a granular, business-context level. Unit economics is a way of maximizing profit in the cloud based on objective measurements such as cost per product, assessing how the organization is performing against its business goals.
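The core of a unit-economics metric is simply dividing cloud spend by a business driver. A minimal sketch, assuming illustrative figures and function names (these are not Turbo360 APIs):

```python
# Hypothetical sketch: unit economics ties raw cloud spend to a business
# metric such as "cost per active customer". All names and numbers here
# are illustrative assumptions, not Turbo360 functionality.

def cost_per_unit(total_cloud_cost: float, units: int) -> float:
    """Cloud cost divided by a business driver (orders, customers, products)."""
    if units <= 0:
        raise ValueError("business metric must be positive")
    return total_cloud_cost / units

# Example: a monthly Azure spend of $12,000 serving 4,800 active customers
print(cost_per_unit(12_000, 4_800))  # 2.5 dollars per customer
```

Tracking this ratio over time, rather than raw spend, shows whether cost growth is keeping pace with business growth.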

Large Language Models (LLMs) Retrieval Augmented Generation (RAG) using Charmed OpenSearch

Large Language Models (LLMs) fall under the category of Generative AI (GenAI), an artificial intelligence type that produces content based on user-defined context. These models undergo training using an extensive dataset composed of trillions of combinations of words from natural language, enabling them to empower interactive and conversational applications across various scenarios.
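The RAG pattern described above boils down to two steps: retrieve relevant documents, then prepend them as context to the prompt the LLM receives. A minimal sketch of that flow; in a real deployment the retriever would be a vector search against Charmed OpenSearch, but here a toy keyword-overlap scorer stands in so the example stays self-contained:

```python
# Hypothetical RAG sketch: keyword-overlap retrieval instead of a real
# OpenSearch k-NN query, so the example runs with no dependencies.

def retrieve(query: str, corpus: list[str], k: int = 2) -> list[str]:
    """Rank documents by how many words they share with the query."""
    q = set(query.lower().split())
    scored = sorted(
        corpus,
        key=lambda doc: len(q & set(doc.lower().split())),
        reverse=True,
    )
    return scored[:k]

def build_prompt(query: str, corpus: list[str]) -> str:
    """Assemble the augmented prompt the LLM would receive."""
    context = "\n".join(retrieve(query, corpus))
    return f"Context:\n{context}\n\nQuestion: {query}"

docs = [
    "OpenSearch supports k-NN vector search for semantic retrieval.",
    "Ubuntu is a Linux distribution published by Canonical.",
    "RAG grounds model answers in retrieved documents.",
]
print(build_prompt("How does RAG use retrieved documents?", docs))
```

Because the model's answer is grounded in the retrieved context rather than only its training data, RAG can cite fresh or private documents the model has never seen.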