
Latest News

Building a GPT-style Assistant for historical incident analysis

Like most things, our AI Assistant started out as an idea. One of our data scientists, Ed, was working with our customers to improve our existing insights. The most common theme that kept surfacing was the wide range of use cases our customers wanted insights for. Using this feedback as our inspiration, we came up with the idea of a natural language assistant you can use to explore your incident data.

The Debrief: incident.io, say hello to AI

This week was a particularly exciting one for us at incident.io. We launched not one, not two, but four AI-powered features to help folks get the most out of their incidents. In this episode of The Debrief, we sit down with Ed Dean, Product Analyst, and Charlie Revett, Product Engineer, to talk through all of these features and discuss how they're already making a measurable impact. You can learn more about our AI features here.

Lessons learned from building our first AI product

Since the advent of ChatGPT, companies have been racing to build AI features into their products. Previously, if you wanted AI features, you needed to hire a team of specialists to build machine learning models in-house. But now that OpenAI’s models are an API call away, the investment required to build shiny AI has never been lower. We were one of those companies. Here’s our journey to building our first AI feature, and some practical advice if you’re planning to do the same.

Make AI Writing Undetectable with These Helpful Tips

Artificial Intelligence (AI) has revolutionized the writing field, proving itself a capable author of anything from news articles to short stories. However, one common challenge is making AI-generated text sound as human and natural as possible. AI-generated text can often be identified by its lack of personal touch, so writers need to shape their content so it effectively engages and entices readers. So how can one make AI writing undetectable and incorporate it seamlessly into their work? Let's explore some helpful tips.

Unleashing the power of AI and automation for effective Cloud Cost Optimization in 2024

In the current dynamic business environment, cloud computing has emerged as a fundamental driver of innovation and scalability. As companies increasingly rely on the cloud for their business initiatives, achieving cloud cost optimization remains a significant hurdle.

Supercharged with AI

One of the most painful parts of incident management is keeping on top of the many things that happen when you’re right in the middle of an incident. From figuring out and communicating what’s happening, to ensuring you learn from previous incidents, and even capturing the right actions – incidents are hard, but they don’t need to be this hard.

Challenges & limitations of LLM fine-tuning

Large Language Models (LLMs) like GPT-3 have revolutionized the field of artificial intelligence, offering unprecedented capabilities in natural language processing. Fine-tuning these models to specific tasks or datasets can enhance their performance. However, this process presents unique challenges and limitations that must be addressed. This article explores the intricacies of LLM fine-tuning, shedding light on the obstacles and constraints faced in this advanced AI domain.

Navigating AI in SOC

With notable advancements in Artificial Intelligence (AI) within cybersecurity, the prospect of a fully automated Security Operations Center (SOC) driven by AI is no longer a distant notion. This paradigm shift not only promises accelerated incident response times and a limited blast radius, but also transforms the perception of cybersecurity from a deterrent into an innovation enabler.

Prompt engineering: A guide to improving LLM performance

Prompt engineering is the practice of crafting input queries or instructions to elicit more accurate and desirable outputs from large language models (LLMs). It is a crucial skill for working with artificial intelligence (AI) applications, helping developers achieve better results from language models. Prompt engineering involves strategically shaping input prompts, exploring the nuances of language, and experimenting with diverse prompts to fine-tune model output and address potential biases.
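The idea of strategically shaping an input prompt can be made concrete with a small sketch. The snippet below (all names and template wording are illustrative assumptions, not any vendor's recommended format) contrasts a bare question with a structured prompt that adds a role, output constraints, and a few-shot example before the question:

```python
# A minimal sketch of prompt engineering: assembling a structured prompt
# from a role line, output constraints, and few-shot examples. The template
# text here is hypothetical, chosen only to illustrate the technique.

def build_prompt(question: str, examples: list[tuple[str, str]]) -> str:
    """Assemble a structured prompt: role, constraints, few-shot Q/A pairs."""
    lines = [
        "You are a concise technical assistant.",
        "Answer in at most two sentences.",
        "",
    ]
    for q, a in examples:
        lines.append(f"Q: {q}")
        lines.append(f"A: {a}")
        lines.append("")
    lines.append(f"Q: {question}")
    lines.append("A:")
    return "\n".join(lines)

# A bare prompt versus an engineered one for the same task.
naive = "What is an SLO?"
engineered = build_prompt(
    "What is an SLO?",
    examples=[("What is an SLA?",
               "A contractual promise about service reliability.")],
)
print(engineered)
```

In practice, experimenting with variations of this template (different roles, constraints, and example sets) and comparing model outputs is what "fine-tuning model output" through prompting amounts to.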

How AI can change the game in the database and streaming system optimization field

The net result of an AI-powered optimization process is not only a better data experience, but also increased developer productivity. Once teams are aware of what can be automated, the improvements can not only speed things up but also reduce costs.