Secure your AI workloads with confidential VMs

AI models depend on large amounts of high-quality data, and for sensitive tasks like medical diagnosis or financial risk assessment, you need access to private data during both training and inference. When performing machine learning tasks in the cloud, enterprises are understandably concerned about data privacy as well as their model's intellectual property. Additionally, stringent industry regulations often prohibit the sharing of such data.

Cron Jobs In Linux - How To Use Cron Jobs To Automate And Schedule Tasks

Cron is a job scheduling utility included in most Unix-like operating systems. It allows users to schedule and automate the execution of repetitive tasks at specific intervals. The crond daemon is the background process that enables cron functionality. It runs continuously in the background, checking crontab files for predefined commands or scripts that are due to run.
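As a sketch of the crontab format the summary describes: each entry consists of five time fields (minute, hour, day of month, month, day of week) followed by the command to run. The script paths below are hypothetical examples.

```shell
# Write an example crontab file. Each line: five time fields, then a command.
#   minute (0-59)  hour (0-23)  day-of-month (1-31)  month (1-12)  day-of-week (0-7)
cat > mycron.txt <<'EOF'
# Run a backup script every day at 02:30 (hypothetical path)
30 2 * * * /usr/local/bin/backup.sh
# Append system uptime to a log every 6 hours, on the hour
0 */6 * * * /usr/bin/uptime >> /var/log/uptime.log
EOF

# To install this for the current user: crontab mycron.txt
# To list the active crontab:           crontab -l
cat mycron.txt
```

Once installed with `crontab mycron.txt`, the crond daemon picks up the entries automatically; no restart is needed.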

Linux CPU Utilization - How To Check Linux CPU Usage

CPU utilization is a crucial metric for measuring system performance and identifying potential bottlenecks in Linux systems. This article explores the concept of CPU utilization, factors contributing to high CPU usage, and various command-line tools and graphical utilities for monitoring and troubleshooting CPU utilization in Linux environments.
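As a rough illustration of measuring CPU utilisation from the command line, the sketch below samples the aggregate counters in `/proc/stat` twice, one second apart, and derives a busy percentage. It ignores iowait and interrupt time for simplicity; real monitoring would typically use tools such as top, mpstat, or vmstat, which the article covers.

```shell
# First field of /proc/stat line 1 is "cpu"; then user, nice, system, idle, ...
read -r _ u1 n1 s1 i1 _ < /proc/stat
sleep 1
read -r _ u2 n2 s2 i2 _ < /proc/stat

# Busy ticks spent in user + nice + system over the interval
busy=$(( (u2 + n2 + s2) - (u1 + n1 + s1) ))
total=$(( busy + (i2 - i1) ))
echo "CPU busy: $(( 100 * busy / total ))%"
```

The counters are cumulative since boot, which is why two samples are needed to get a rate rather than a lifetime average.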

How should a great K8s distro feel? Try the new Canonical Kubernetes, now in beta

Kubernetes revolutionised container orchestration, allowing faster and more reliable application deployment and management. But even though it transformed the world of DevOps, it introduced new challenges around security maintenance, networking and application lifecycle management.

Ubuntu AI | S2E3 | GPU utilisation optimisation at KubeconEU 2024

Maciej is not only the host of our podcast, but also an experienced keynote speaker. After a joint keynote at KubeconEU 2023 about highly sensitive data, in 2024 Maciej goes to Paris to talk about GPU utilisation. During our podcast, we cover many aspects of GPU utilisation. From best practices to existing tooling, there are different angles that Maciej talks about, giving a sneak peek into his keynote. Are you curious how open source tooling plays a role in optimising GPU utilisation? Listen to our podcast!

Large Language Models (LLMs) Retrieval Augmented Generation (RAG) using Charmed OpenSearch

Large Language Models (LLMs) fall under the category of Generative AI (GenAI), a type of artificial intelligence that produces content based on user-defined context. These models are trained on extensive datasets containing trillions of word combinations drawn from natural language, enabling them to power interactive and conversational applications across a wide range of scenarios.
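The RAG pattern named in the title can be sketched in three steps: retrieve passages relevant to the question, augment the prompt with them, then generate an answer. The toy below uses grep over local files as a stand-in for the retrieval step (a real deployment would query a search index such as Charmed OpenSearch) and simply prints the assembled prompt instead of calling an LLM; all file names and text are hypothetical.

```shell
# Build a tiny local corpus standing in for an OpenSearch index
mkdir -p corpus
echo "Charmed OpenSearch supports vector search." > corpus/doc1.txt
echo "RAG grounds LLM answers in retrieved documents." > corpus/doc2.txt

question="What does RAG do?"

# Retrieval: find corpus passages sharing a key term with the question
context=$(grep -ih "RAG" corpus/*.txt)

# Augmentation: prepend the retrieved context to the user question;
# a real pipeline would now send this prompt to the LLM
prompt="Context: ${context}
Question: ${question}"
echo "$prompt"
```

The value of the pattern is that the model's answer is grounded in retrieved documents rather than relying only on what it memorised during training.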