
Michael Burry Warns of Artificially Inflated Earnings

On November 10, 2025, Michael Burry, the investor famous for predicting the 2008 subprime mortgage crisis and featured in the film "The Big Short," posted on X accusing America's big tech giants of inflating their earnings. The criticism centers on a widespread accounting practice among companies that have invested heavily in AI: artificially extending the useful life of IT equipment, primarily Nvidia GPUs, which lowers annual depreciation expense and thereby lifts reported earnings.
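The mechanics behind this criticism can be sketched with straight-line depreciation. All figures below are invented for illustration; they are not taken from any company's filings:

```python
# Hypothetical illustration: how extending the assumed useful life of GPU
# hardware lowers annual depreciation expense under the straight-line
# method, lifting reported pre-tax earnings by the same amount.

def straight_line_depreciation(cost, salvage, useful_life_years):
    """Annual depreciation expense under the straight-line method."""
    return (cost - salvage) / useful_life_years

gpu_capex = 12_000_000_000   # hypothetical $12B GPU purchase
salvage = 0                  # assume no residual value for simplicity

expense_3yr = straight_line_depreciation(gpu_capex, salvage, 3)
expense_6yr = straight_line_depreciation(gpu_capex, salvage, 6)

# Stretching the schedule from 3 to 6 years halves the annual charge;
# the difference flows straight into reported pre-tax earnings each year.
earnings_lift = expense_3yr - expense_6yr

print(f"Annual expense at 3 years: ${expense_3yr / 1e9:.1f}B")
print(f"Annual expense at 6 years: ${expense_6yr / 1e9:.1f}B")
print(f"Pre-tax earnings lift:     ${earnings_lift / 1e9:.1f}B")
```

The same capital outlay produces half the annual expense once the depreciation schedule doubles, which is why useful-life assumptions matter so much when AI capex is this large.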

Agentic AI Essentials: Your Guide to the Future of Automation

To mark the launch, we’re publishing Agentic AI Essentials, a four-part series to help organizations navigate the reality of agentic AI adoption. Across the series, we’ll look at the questions that matter most: what’s real versus hype, how to avoid adoption pitfalls, how to measure ROI, and how roles will evolve once agents are onboarded. Here’s a sneak peek at what’s in store.

How companies are using Civo GPUs to accelerate AI innovation without runaway costs

Accessing high-performance GPUs shouldn’t feel like a bottleneck. Yet as AI adoption accelerates, many teams are discovering that hyperscaler offerings often come with hidden costs: long wait times, opaque billing, and layers of unnecessary complexity. At Civo, we’ve taken a different approach. Our GPUs enable companies to move faster while keeping infrastructure overhead and costs firmly under control.

How to Ensure AI-Generated Code is Reliable with Runtime Context

TL;DR: AI coding assistants have sped up code delivery but created a validation gap. Historical telemetry and static analysis cannot predict the behavior of unfamiliar, high-volume code. Lightrun’s Runtime Context MCP closes that gap, allowing AI assistants to verify behavior before it breaks and resolve issues in real time.

Beyond the Hype: Building a Future-Proof Foundation for the AI-Native Enterprise

We are witnessing a fundamental transformation in how software is built. The industry has moved beyond the experimental phase of Machine Learning Operations and entered a complex new reality: the era of the AI Software Supply Chain. The adoption metrics confirm this shift is irreversible. Google reports that 90% of tech workers are now using AI as part of their daily work. Similarly, McKinsey data reveals that 88% of organizations use AI in at least one business function.

Build custom apps in seconds with conversational AI in App Builder

Using a drag-and-drop interface, engineering teams can create apps that support troubleshooting, improve day-to-day operations, and offer self-service access without leaving Datadog. With the new conversational AI feature, teams can turn an idea into a working app in seconds. Watch the video to see how it works.

Poisoning the Well: The Invisible Danger in Your AI Supply Chain

Welcome to AI Research Bites. This series of short, informative talks showcases cutting-edge research from the ServiceNow AI Research team. The AI Research Bites are open to all, especially those interested in keeping up with the fast-paced AI research community.

Finetuning Gemma 3 on private data with Unsloth and CircleCI

Fine-tuning Large Language Models (LLMs) on private, domain-specific data can unlock significant value for your use case. When done correctly, you can create AI apps that understand your organization’s unique context, speak your brand’s voice, and deliver remarkably accurate results that general models cannot match. However, fine-tuning is not always the right solution: many teams rush into this complex technique without exploring simpler alternatives first.