
AI

How to benchmark your database management system | Data & AI Masters | Canonical

In this talk, Mohamed Nsiri will provide a step-by-step guide on how to benchmark a given database management system configuration. We will explain how to assess the performance impact of a change (in memory, CPU speed, etc.) and how to compare different setups on a common footing. The workshop will cover various factors that can significantly impact the performance of your databases, including the number of concurrent users, the workload pattern, and more.
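One of the factors the talk highlights is the number of concurrent users. As an illustration only, here is a minimal Python sketch of that idea: it measures query throughput as concurrency grows, using SQLite and a made-up "items" table purely as stand-ins. A real benchmark would target your actual DBMS and a workload that resembles production.

    # Illustrative sketch: measure query throughput at different concurrency levels.
    # SQLite and the "items" table are stand-ins, not part of the talk's setup.
    import sqlite3
    import threading
    import time

    DB = "bench.db"

    def setup() -> None:
        conn = sqlite3.connect(DB)
        conn.execute("CREATE TABLE IF NOT EXISTS items (id INTEGER PRIMARY KEY, value TEXT)")
        conn.execute("DELETE FROM items")
        for i in range(1000):
            conn.execute("INSERT INTO items (value) VALUES (?)", (f"row-{i}",))
        conn.commit()
        conn.close()

    def worker(queries_per_user: int) -> None:
        conn = sqlite3.connect(DB)  # one connection per simulated user
        for _ in range(queries_per_user):
            conn.execute("SELECT COUNT(*) FROM items").fetchone()
        conn.close()

    def run(concurrent_users: int, queries_per_user: int = 200) -> float:
        threads = [threading.Thread(target=worker, args=(queries_per_user,))
                   for _ in range(concurrent_users)]
        start = time.perf_counter()
        for t in threads:
            t.start()
        for t in threads:
            t.join()
        elapsed = time.perf_counter() - start
        return concurrent_users * queries_per_user / elapsed  # queries per second

    if __name__ == "__main__":
        setup()
        for users in (1, 4, 16):
            print(f"{users:>2} concurrent users: {run(users):8.0f} queries/s")

Holding the workload fixed and varying one parameter at a time is what makes results from two different setups comparable.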

The Power of Inclusive AI

The importance of artificial intelligence in driving business success has never been clearer. Yet, for too long, advanced AI capabilities have been the preserve of tech giants and well-funded enterprises. This concentration of resources has created an inequality that threatens to leave many businesses behind. But democratizing AI access matters far beyond individual company success: it's about fostering a vibrant, competitive ecosystem where innovations can emerge from unexpected places.

AWS Bedrock Pricing: Your 2024 Guide to Amazon Bedrock Costs

The future is AI. That’s a fact, and all the major cloud corporations are taking notice and investing in generative AI offerings to serve their customers better. Microsoft Azure has invested in OpenAI’s ChatGPT, Google has Vertex AI, and Amazon has created Bedrock. But what exactly is AWS Bedrock? And, most importantly, how much will it cost? Will this generative AI be an easy investment, or will you have to break the budget to squeeze it in?

Lumigo Copilot Beta Demo | AI-Powered Observability in Action

Discover how Lumigo Copilot transforms troubleshooting and observability with the power of AI. In this demo, we’ll showcase Lumigo Copilot in action. Whether you're a senior developer or just starting out, Lumigo Copilot makes debugging smarter, faster, and more intuitive. Try Lumigo Copilot today: lumigo.io. Subscribe for more product demos, tips, and insights on modern observability.

Creating Realistic 3D Models With AI-Generated Textures and Materials

The creation of realistic 3D models has always been a challenge for designers, requiring meticulous attention to detail in texturing and material application. In recent years, the advent of artificial intelligence (AI) has introduced transformative methods that enhance realism and simplify the process. Among its applications, AI-generated textures and materials stand out for their ability to mimic intricate natural and synthetic surfaces, making it easier to produce photorealistic visuals.

Building RAG with enterprise open source AI infrastructure

One of the most critical gaps in traditional Large Language Models (LLMs) is that they rely on the static knowledge captured during training. They may be very good at understanding and responding to prompts, but they often fall short when asked for current or highly specific information.
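Retrieval-Augmented Generation (RAG) closes that gap by fetching relevant, up-to-date documents at query time and putting them into the prompt. The sketch below is only an outline of that pattern, assuming a deliberately simple keyword-overlap retriever, a toy corpus, and an unspecified LLM call; production pipelines typically use embedding-based vector search instead.

    # Illustrative RAG outline: retrieve context, then build a grounded prompt.
    # The corpus, the scoring, and the final LLM call are placeholders.
    def retrieve(query: str, documents: list[str], k: int = 2) -> list[str]:
        """Rank documents by naive keyword overlap with the query."""
        terms = set(query.lower().split())
        scored = sorted(documents,
                        key=lambda d: len(terms & set(d.lower().split())),
                        reverse=True)
        return scored[:k]

    def build_prompt(query: str, context: list[str]) -> str:
        """Ask the model to answer only from the retrieved context."""
        joined = "\n".join(f"- {c}" for c in context)
        return f"Answer using only this context:\n{joined}\n\nQuestion: {query}"

    corpus = [
        "Release 2.4 of the product shipped in March 2025.",
        "The public API is rate-limited to 100 requests per minute.",
        "Support tickets are triaged within one business day.",
    ]

    question = "What is the API rate limit?"
    prompt = build_prompt(question, retrieve(question, corpus))
    print(prompt)  # this prompt would then be sent to the LLM of your choice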

LLM Testing in 2025: Methods and Strategies

Large Language Models, or LLMs, have become a near-ubiquitous technology in recent years. Promising the ability to generate human-like content with simple and direct prompts, LLMs have been integrated across a diverse array of systems, purposes, and functions, including content generation, image identification and curation, and even heuristics-based performance testing for APIs and other software components.

Three benefits of AI-Powered Incident Management

Today, every enterprise is digital. Regardless of industry, every business must incorporate digital technologies and strategies into its operations to remain competitive. Maintaining reliable IT infrastructures and digital services while minimizing downtime due to unplanned outages is critical to business success.

Engineering AI systems with Model Context Protocol

On November 26, 2024, Anthropic released the Model Context Protocol (MCP)—an open standard for data exchange between applications and data sources. MCP simplifies how Large Language Models (LLMs) interact with external tools and data, addressing the challenges developers face when integrating AI into their systems. At Raygun, we’ve been exploring agentic workflows to improve productivity and saw real potential in MCP.
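To make the idea concrete, here is a minimal sketch of an MCP server that exposes a single tool to an LLM client, written against the open-source Python SDK (the mcp package). The server name and the tool are hypothetical; this is not Raygun's integration, just the shape of the protocol's server side.

    # Minimal MCP server sketch using the Python SDK (pip install mcp).
    # The server name and the tool below are illustrative only.
    from mcp.server.fastmcp import FastMCP

    mcp = FastMCP("demo-server")

    @mcp.tool()
    def add(a: int, b: int) -> int:
        """Add two numbers on behalf of the connected LLM client."""
        return a + b

    if __name__ == "__main__":
        mcp.run()  # serves the tool over stdio for MCP-aware clients such as Claude Desktop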

Grafana LLM plugin updates: choose the LLM models and providers that work best for you

At Grafana Labs, our mission has always been to empower users with the tools they need to build their own observability solutions. Our big tent philosophy embodies this mission by allowing you to choose the tools and technologies that best suit your needs. In this post, we want to share an update to our LLM plugin that reflects this philosophy in action.