
Databases

The latest News and Information on Databases and related technologies.

MongoDB use cases for the telecommunications industry

A trusted database is fundamental to the smooth and secure operation of telecommunications services, from network management and customer service to compliance and fraud prevention. MongoDB is one of the most widely used databases (DB Engines, 2024) among enterprises, including those in the telecommunications industry. It provides a sturdy, adaptable, and trustworthy foundation that safeguards sensitive customer data while enabling swift responses to rapidly evolving situations.

Monitor Amazon MemoryDB with Datadog

Amazon MemoryDB for Redis is a highly durable in-memory database service that uses cross-availability-zone data storage and fast failover to provide microsecond read times and single-digit-millisecond write times. Datadog’s integration for MemoryDB collects a range of metrics that give you important visibility into MemoryDB performance.
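To make this concrete, here is a minimal sketch of pulling one of those MemoryDB metrics back out of Datadog with its Python client. The API and application keys are placeholders, and the metric name aws.memorydb.cpuutilization is an assumption; check the metric explorer for the exact names the integration reports.

```python
import time

from datadog import initialize, api

# Placeholder keys -- replace with your own Datadog API and application keys.
initialize(api_key="<DD_API_KEY>", app_key="<DD_APP_KEY>")

# Average CPU utilization across MemoryDB nodes over the last hour.
# The metric name is an assumption; confirm it in Datadog's metric explorer.
now = int(time.time())
result = api.Metric.query(
    start=now - 3600,
    end=now,
    query="avg:aws.memorydb.cpuutilization{*}",
)

for series in result.get("series", []):
    print(series["metric"], series["pointlist"][-1])
```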

Aligning Database Operations With DevOps

Even with dedicated database staff (often with impressive technical skills), businesses frequently need help understanding how to address database issues. IT managers and database specialists are keen to try different approaches to keeping databases working as efficiently as possible, and one popular approach is ‘DevOps’. As companies navigate the complexities of digital transformation and a growing trend toward cloud migration, IT environments have become extraordinarily complex.

Devart Has Become a Digital Sponsor of SQLBits 2024

On March 19-23, Devart joined SQLBits 2024, a non-profit annual conference that brought together 2,279 attendees from 60 countries for training, networking, and experience sharing. The event is a gathering point for everyone interested in data engineering, architecture, database administration, analytics, and development with SQL, and we were glad to join the ranks of people who share our passion for SQL.

How to Connect to Azure SQL Database Using Azure Private Link

Azure Private Link is a secure means of accessing Azure PaaS services (including Azure SQL Database and Azure Storage) over a private endpoint in a virtual network. In other words, you can create your own private link service in your virtual network and deliver it to your customers without exposing it to the public internet. And if you need a tool to develop and manage Azure databases in this environment, there’s no better option than dbForge Studio for SQL Server.
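As a rough illustration, the sketch below connects to Azure SQL Database from inside the virtual network using pyodbc. The server name, database, and credentials are placeholders; the point is that, with the private endpoint and the privatelink.database.windows.net DNS zone in place, the usual public FQDN resolves to a private IP and the traffic never touches the public internet.

```python
import pyodbc

# Placeholder server, database, and credentials -- replace with your own.
# With a private endpoint, this FQDN resolves to a private IP inside the VNet
# via the privatelink.database.windows.net DNS zone.
conn_str = (
    "Driver={ODBC Driver 18 for SQL Server};"
    "Server=tcp:myserver.database.windows.net,1433;"
    "Database=mydb;"
    "Uid=myuser;"
    "Pwd=<password>;"
    "Encrypt=yes;TrustServerCertificate=no;Connection Timeout=30;"
)

conn = pyodbc.connect(conn_str)
cursor = conn.cursor()
cursor.execute("SELECT @@SERVERNAME, SYSDATETIME();")
print(cursor.fetchone())
conn.close()
```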

10 tips for Test Data Management success

The role of effective Test Data Management (TDM) is often underestimated in the software development process, yet it is a cornerstone for ensuring quality, compliance, and efficiency throughout the software development life cycle. As Bloor says in its Test Data Management Market Update 2024, “… many enterprises are, to quote one vendor we spoke to, ‘still in the stone age’ when it comes to TDM.”
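One of those tips almost always involves masking production data before it reaches test environments. As a loose illustration (not tied to any particular TDM tool), the sketch below deterministically masks email addresses so the values are fake but references to the same address elsewhere in the test data stay consistent; the salt and the address format are arbitrary choices.

```python
import hashlib

def mask_email(email: str, salt: str = "tdm-demo-salt") -> str:
    """Replace a real address with a stable, non-reversible test address."""
    digest = hashlib.sha256((salt + email).encode()).hexdigest()[:12]
    return f"user_{digest}@example.test"

rows = [
    {"id": 1, "email": "alice@corp.com"},
    {"id": 2, "email": "bob@corp.com"},
]

# The same input always maps to the same output, so any other table that
# references the email still lines up after masking.
masked = [{**row, "email": mask_email(row["email"])} for row in rows]
print(masked)
```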

How to set up an open source database monitoring stack with Grafana Cloud

One of the great powers of Grafana is the open source community behind it — a community that provides a breadth of ready-to-use dashboards, plugins, exporters, and instructions that make a million tasks easier. The sheer scale of it all means whatever you need probably already exists somewhere. To illustrate this, I want to share an example of how to use these tools as a base for building a comprehensive database monitoring solution.
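In practice that usually means pointing a ready-made exporter (mysqld_exporter, postgres_exporter, and so on) at your database and letting Grafana Agent ship the metrics to Grafana Cloud. Purely as an illustrative sketch, the Python snippet below exposes a custom database metric in Prometheus format that such a scrape could pick up; the metric name and the random value standing in for a real query are assumptions.

```python
import random
import time

from prometheus_client import Gauge, start_http_server

# Hypothetical gauge; in a real setup the value would come from a query
# against your database (active connections, replication lag, etc.).
DB_ACTIVE_CONNECTIONS = Gauge(
    "db_active_connections",
    "Number of active connections reported by the database",
)

def collect_once() -> None:
    # Placeholder for a real query such as SHOW PROCESSLIST or pg_stat_activity.
    DB_ACTIVE_CONNECTIONS.set(random.randint(5, 50))

if __name__ == "__main__":
    # Expose metrics on http://localhost:8000/metrics for a Prometheus-style scrape.
    start_http_server(8000)
    while True:
        collect_once()
        time.sleep(15)
```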

Discovering MongoDB Atlas: Perfect for Gen AI-powered Apps

As established in OpsMatters' previous AI articles, generative AI is changing the way digital experiences are created. With its ability to generate new content from existing data - whether that be images, music, or human-like text - GenAI has opened up new possibilities for developers and businesses alike. However, to make any GenAI-powered application run smoothly, you need a reliable database.
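For a feel of what that looks like in code, here is a minimal pymongo sketch of an Atlas Vector Search query. The connection string, database, collection, index name (vector_index), embedding field (embedding), and the placeholder query vector are all assumptions to be replaced with your own.

```python
from pymongo import MongoClient

# Placeholder connection string, database, and collection.
client = MongoClient("mongodb+srv://<user>:<password>@cluster0.example.mongodb.net")
collection = client["demo"]["articles"]

# In a real app this vector comes from your embedding model; the dimension
# here is just a placeholder of the right shape.
query_vector = [0.01] * 1536

pipeline = [
    {
        "$vectorSearch": {
            "index": "vector_index",   # assumed Atlas Vector Search index name
            "path": "embedding",       # assumed field holding the embeddings
            "queryVector": query_vector,
            "numCandidates": 100,
            "limit": 5,
        }
    },
    {"$project": {"title": 1, "score": {"$meta": "vectorSearchScore"}}},
]

for doc in collection.aggregate(pipeline):
    print(doc)
```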

How to migrate MySQL databases to a cloud environment?

Migrating MySQL databases to a cloud environment can seem daunting, but the right approach ensures minimal downtime and a seamless transition. The key to a successful migration lies in thorough planning and preparation: understanding your database's complexity, estimating the data volume, and determining what level of downtime is acceptable for your organization.
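For a small database where a maintenance window is acceptable, the simplest path is a dump-and-load. The sketch below drives mysqldump and the mysql client from Python; the hostnames, users, and database name are placeholders, and for large datasets or near-zero downtime you would typically rely on replication or a managed migration service instead.

```python
import subprocess

SOURCE = {"host": "on-prem-db.internal", "user": "backup", "db": "appdb"}
TARGET = {"host": "cloud-db.example.com", "user": "admin", "db": "appdb"}
DUMP_FILE = "appdb.sql"

# 1. Dump the source database. --single-transaction keeps the dump consistent
#    for InnoDB tables without blocking writers; -p prompts for the password.
with open(DUMP_FILE, "w") as out:
    subprocess.run(
        [
            "mysqldump",
            "-h", SOURCE["host"],
            "-u", SOURCE["user"],
            "-p",
            "--single-transaction",
            "--routines",
            "--triggers",
            SOURCE["db"],
        ],
        stdout=out,
        check=True,
    )

# 2. Load the dump into the cloud instance.
with open(DUMP_FILE) as dump:
    subprocess.run(
        ["mysql", "-h", TARGET["host"], "-u", TARGET["user"], "-p", TARGET["db"]],
        stdin=dump,
        check=True,
    )
```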