Integrating AI and DevOps for Software Development Teams

For a long time, machine learning and AI on one side and software development on the other were separate kingdoms. Sometimes they touched and something magical happened, but more often the collaboration fell flat, hampered by a lack of mutual understanding, shared language, and compatible tools. With the meteoric rise and growing accessibility of powerful generative AI and LLMs, the need for these disciplines to work together toward real-world engineering and customer value has never been greater.

LLM hallucinations: How to detect and prevent them with CI

An LLM hallucination occurs when a large language model (LLM) generates a response that is factually incorrect, nonsensical, or disconnected from the input prompt. Hallucinations are a byproduct of the probabilistic nature of language models, which generate responses based on patterns learned from vast datasets rather than factual understanding.
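One way to catch hallucinations before they ship is to treat factual drift like any other regression and test for it in CI. The sketch below is a minimal, hedged example using pytest: `generate_answer` is a hypothetical placeholder for whatever LLM client you actually call, and the prompts and expected facts are illustrative only.

```python
# Minimal sketch of a CI hallucination check with pytest.
# `generate_answer` is a hypothetical wrapper; wire it to your real LLM client
# before this test can run.
import pytest


def generate_answer(prompt: str) -> str:
    # Hypothetical helper -- replace with a call to your model or provider.
    raise NotImplementedError("Connect this to your LLM client")


# Small set of prompts with well-known ground-truth facts.
KNOWN_FACTS = [
    ("What does CI stand for in software delivery?", "continuous integration"),
    ("What is the capital of France?", "paris"),
]


@pytest.mark.parametrize("prompt,expected_substring", KNOWN_FACTS)
def test_no_factual_drift(prompt, expected_substring):
    answer = generate_answer(prompt).lower()
    # A hallucinated answer is unlikely to contain the expected fact;
    # this coarse guardrail runs on every pipeline build.
    assert expected_substring in answer
```

Run this suite on every build so a model, prompt, or provider change that starts producing fabricated answers fails the pipeline instead of reaching users.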

What is microservices architecture?

Microservices architecture is a method of developing software systems that structures an application as a collection of loosely coupled services, each focusing on a single function or business capability. Each service operates within a discrete, confined context, communicating with other services through well-defined interfaces — typically APIs.
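To make the idea concrete, here is a minimal sketch of a single-responsibility service, assuming Flask is available. The service name, route, and in-memory data store are illustrative; the point is that one business capability (order lookup) is owned by one service and reached only through its API.

```python
# Minimal sketch of a single-capability microservice using Flask.
from flask import Flask, jsonify, abort

app = Flask(__name__)

# Stand-in for the service's private data store; other services never
# read this directly -- they go through the HTTP API below.
_ORDERS = {"1001": {"id": "1001", "status": "shipped"}}


@app.route("/orders/<order_id>", methods=["GET"])
def get_order(order_id):
    order = _ORDERS.get(order_id)
    if order is None:
        abort(404)
    return jsonify(order)


if __name__ == "__main__":
    app.run(port=5001)
```

Because the service exposes only this well-defined interface, it can be deployed, scaled, and rewritten independently of the services that call it.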

Test-driven development (TDD) explained

Test-driven development (TDD) is a software development process in which you write tests for your code before you write the code itself. This approach changes where testing sits in the development lifecycle: where the traditional waterfall model was linear, with testing occurring near the end of one long timeline, TDD makes testing an ongoing, iterative process.
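The sketch below illustrates one red-green-refactor cycle with pytest. The function name and behavior are made up for the example; in practice the test would live in its own file and fail first, before the implementation exists.

```python
# Illustrative TDD cycle, shown in one file for brevity.

# Step 1 (red): write the failing test first -- it specifies the behavior.
def test_slugify_lowercases_and_joins_with_hyphens():
    assert slugify("Hello World") == "hello-world"


# Step 2 (green): write just enough code to make the test pass.
def slugify(text: str) -> str:
    return "-".join(text.lower().split())

# Step 3 (refactor): clean up with the test as a safety net, then repeat
# the cycle for the next behavior.
```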

What's in store for AI in 2024 with Patrick Debois

In this episode, Rob is joined by Patrick Debois, a seasoned industry expert and DevOps pioneer. Patrick shares his personal odyssey within the realm of DevOps, reflecting on the current state of the industry compared to his initial expectations. The conversation delves into the convergence of business analytics and technical analytics, exploring innovative approaches developers are adopting to integrate generative AI into their products.

Prompt engineering: A guide to improving LLM performance

Prompt engineering is the practice of crafting input queries or instructions to elicit more accurate and desirable outputs from large language models (LLMs). It is a crucial skill for working with artificial intelligence (AI) applications, helping developers achieve better results from language models. Prompt engineering involves strategically shaping input prompts, exploring the nuances of language, and experimenting with diverse prompts to fine-tune model output and address potential biases.
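A small sketch of what that iteration looks like in code is shown below. The `complete()` function is a hypothetical wrapper around whichever LLM client you use; the technique itself is provider-agnostic, and the prompts are illustrative.

```python
# Minimal sketch of prompt iteration, assuming a hypothetical `complete()`
# wrapper around your LLM client of choice.
def complete(prompt: str) -> str:
    raise NotImplementedError("Replace with your LLM client call")


# Vague baseline: the model must guess the format, audience, and scope.
baseline = "Summarize this changelog."

# Engineered prompt: explicit role, constraints, and output format.
engineered = (
    "You are a release manager writing for non-technical stakeholders.\n"
    "Summarize the changelog below in exactly three bullet points,\n"
    "each under 20 words, and avoid internal ticket numbers.\n\n"
    "Changelog:\n{changelog}"
)


def summarize(changelog: str) -> str:
    return complete(engineered.format(changelog=changelog))
```

Comparing outputs from the baseline and engineered prompts on the same inputs is the core loop: adjust the instructions, rerun, and keep the variant that produces the most accurate, consistent results.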

Dave Farley reflects on 25 years of software development & the future of AI

In this episode, Rob is joined by Dave Farley, software legend and author of the books "Continuous Delivery" and "Modern Software Engineering". The two tackle the essence of software development culture and the current state of software delivery. They unpack why it's important to prioritize problem-solving abilities over technical skills in hiring, emphasizing a healthy culture and the need for continuous learning on the job.

Testing a PyTorch machine learning model with pytest and CircleCI

PyTorch is an open-source machine learning (ML) framework that accelerates the path from research prototyping to production deployment. You can work with PyTorch using regular Python without delving into the underlying native C++ code. It contains a full toolkit for building production-worthy ML applications, including layers for deep neural networks, activation functions and optimizers. It also has associated libraries for computer vision and natural language processing.
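As a taste of what such tests look like, here is a minimal sketch of pytest checks for a PyTorch model. The tiny classifier defined inline is an illustrative stand-in, not the model from the article; the checks themselves (output shape and a single training step reducing loss) are common, CI-friendly sanity tests.

```python
# Minimal sketch: pytest sanity checks for a small PyTorch model.
import torch
import torch.nn as nn


class TinyClassifier(nn.Module):
    # Illustrative model -- replace with your own architecture.
    def __init__(self, in_features: int = 4, num_classes: int = 3):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_features, 16),
            nn.ReLU(),
            nn.Linear(16, num_classes),
        )

    def forward(self, x):
        return self.net(x)


def test_output_shape():
    model = TinyClassifier()
    batch = torch.randn(8, 4)
    # Logits should have one row per sample and one column per class.
    assert model(batch).shape == (8, 3)


def test_one_training_step_reduces_loss():
    torch.manual_seed(0)  # keep the check deterministic in CI
    model = TinyClassifier()
    x, y = torch.randn(32, 4), torch.randint(0, 3, (32,))
    criterion = nn.CrossEntropyLoss()
    optimizer = torch.optim.SGD(model.parameters(), lr=0.05)
    loss_before = criterion(model(x), y)
    optimizer.zero_grad()
    loss_before.backward()
    optimizer.step()
    loss_after = criterion(model(x), y)
    assert loss_after.item() < loss_before.item()
```

Wired into a CircleCI workflow, tests like these catch broken model code the same way unit tests catch broken application code.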