Customers Demanding New Features and Unable to Provide Quickly?

With the adoption of Agile methodology, teams are expected to add new features to an application or product quickly. However, if the process of moving from Dev > Test > Stage > Prod takes weeks or months, then you have a problem on your hands (how big it is depends on the type of app or product). Customers will be demanding new features, and the development team will be able to build them quickly, which is a good thing; the bottleneck is getting those features through the release pipeline and into customers' hands.

Citrix Cloud 101: Key Questions Every Citrix Admin Wants Answered

A few weeks back, eG Innovations partnered with David Wilkinson to deliver a webinar on the topic “Is Citrix Cloud Enterprise Ready? Best Practices to Get the Most Out of Citrix Cloud Deployments.” Citrix Cloud implementations are growing across the industry, and as organizations begin evaluating their cloud options, Citrix administration teams want to understand how Citrix Cloud will hold up, scale, and be supported in place of on-premises Citrix deployments.

Unleash the Power of Anywhere IT Ops with Enterprise Alert and its Mobile App

When we introduced ‘remote actions’ in 2012, i.e. the execution of IT automation tasks from your smartphone, we aimed to empower the mobile (IT) workforce of the future and to free IT people from being bound to their desks, notebooks and PCs.

How to Read Log Files on Windows, Mac, and Linux

Logging is a data collection method that stores pieces of information about the events that take place in a computer system. Log files differ based on the kind of information they contain, the events that trigger their creation, and several other factors. This post focuses on the log files created by the three main operating systems--Windows, Mac, and Linux--and on the main differences in how you access and read log files on each OS.
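
As a quick illustration of those cross-platform differences, here is a minimal Python sketch (not part of the original post) that prints the last few lines of a common plain-text log on each OS. The paths are assumptions: they vary by distribution and version, and Windows Event Logs are binary, so the Windows example uses a plain-text log instead.

import platform
import sys
from pathlib import Path

# Example plain-text log locations per OS (assumptions; adjust for your system).
# Linux: Debian/Ubuntu use /var/log/syslog, RHEL/CentOS use /var/log/messages.
# macOS: /var/log/system.log (newer releases rely mostly on the unified log, "log show").
# Windows: Event Logs are binary; CBS.log is one plain-text log you can open directly.
DEFAULT_LOGS = {
    "Linux": Path("/var/log/syslog"),
    "Darwin": Path("/var/log/system.log"),
    "Windows": Path(r"C:\Windows\Logs\CBS\CBS.log"),
}

def tail(path, lines=20):
    """Return the last `lines` lines of a text log file."""
    with path.open("r", errors="replace") as f:
        return f.readlines()[-lines:]

if __name__ == "__main__":
    log_path = DEFAULT_LOGS.get(platform.system())
    if log_path is None or not log_path.exists():
        sys.exit("No readable default log found for this OS; pass a path instead.")
    for line in tail(log_path):
        print(line, end="")

Note that system logs are often readable only with elevated privileges, and the Windows Event Logs proper require Event Viewer, the wevtutil command, or a dedicated API rather than a simple file read.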

Dynamically Provisioning Local Storage in Kubernetes

At LogDNA, we’re all about speed. We need to ingest, parse, index, and archive several terabytes of data per second. To reach these speeds, we need to find and implement innovative solutions for optimizing all steps of our pipeline, especially when it comes to storing data.