
Kublr

Kubernetes in Highly Restrictive Environments

Installing Kubernetes is easy. Ensuring it complies with your organization’s enterprise governance and security requirements isn’t. Oleg will outline a plan for using the technology while meeting enterprise security requirements. In this technically focused talk, he’ll summarize common prerequisites for running Kubernetes in production and explain how to leverage fine-grained controls and separation of responsibilities to meet enterprise governance and security needs.
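To make the idea of fine-grained controls concrete, here is a minimal sketch (not from the talk itself) that uses the official Kubernetes Python client to create a namespaced Role and RoleBinding confining a hypothetical "dev-team" group to read-only access on pods in its own namespace. The namespace, group name, and permissions are assumptions for illustration.

# Minimal RBAC sketch with the official "kubernetes" Python client.
# Namespace, group name, and permissions are hypothetical examples.
from kubernetes import client, config

config.load_kube_config()          # or load_incluster_config() inside a cluster
rbac = client.RbacAuthorizationV1Api()

namespace = "team-a"               # hypothetical team namespace

# Role: read-only access to pods within the team namespace only.
role = {
    "apiVersion": "rbac.authorization.k8s.io/v1",
    "kind": "Role",
    "metadata": {"name": "pod-reader", "namespace": namespace},
    "rules": [{
        "apiGroups": [""],
        "resources": ["pods", "pods/log"],
        "verbs": ["get", "list", "watch"],
    }],
}

# RoleBinding: grant that Role to the hypothetical "dev-team" group.
binding = {
    "apiVersion": "rbac.authorization.k8s.io/v1",
    "kind": "RoleBinding",
    "metadata": {"name": "pod-reader-binding", "namespace": namespace},
    "subjects": [{
        "kind": "Group",
        "name": "dev-team",
        "apiGroup": "rbac.authorization.k8s.io",
    }],
    "roleRef": {
        "kind": "Role",
        "name": "pod-reader",
        "apiGroup": "rbac.authorization.k8s.io",
    },
}

rbac.create_namespaced_role(namespace, role)
rbac.create_namespaced_role_binding(namespace, binding)

Separation of responsibilities follows the same pattern: cluster administrators manage namespaces and bindings, while each team operates only within the namespace it has been granted.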

Canary Release on Kubernetes with Spinnaker, Istio, and Prometheus

In a microservices world, applications consist of dozens, hundreds, or even thousands of components. Manually deploying them and verifying deployment quality in production is virtually impossible. Kubernetes natively supports rolling updates, and Spinnaker adds blue-green application deployments on top of it. Gradual canary rollouts, however, don’t come out of the box; they can be achieved by adding Istio and Prometheus to the equation.
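To give a flavor of the pattern, here is a hedged sketch (not from the talk) of the kind of canary-analysis step a Spinnaker pipeline or custom script might run: query Prometheus for the canary’s error rate and decide whether to promote or roll back. The Prometheus endpoint, workload labels, and 1% threshold are hypothetical.

# Canary analysis sketch: compare the canary's HTTP error rate against a threshold.
# Prometheus endpoint, metric labels, and the 1% threshold are assumed values.
import requests

PROMETHEUS = "http://prometheus.example.com"   # hypothetical endpoint
ERROR_RATE_QUERY = (
    'sum(rate(istio_requests_total{destination_workload="app-canary",response_code=~"5.."}[5m]))'
    ' / sum(rate(istio_requests_total{destination_workload="app-canary"}[5m]))'
)

def canary_error_rate() -> float:
    resp = requests.get(f"{PROMETHEUS}/api/v1/query", params={"query": ERROR_RATE_QUERY})
    resp.raise_for_status()
    result = resp.json()["data"]["result"]
    return float(result[0]["value"][1]) if result else 0.0

if canary_error_rate() < 0.01:
    print("Canary healthy: shift more traffic to it, e.g. by raising Istio VirtualService weights.")
else:
    print("Canary degraded: roll back and route all traffic to the stable version.")

In practice the traffic shift itself is expressed as weighted routing rules in an Istio VirtualService, which the pipeline updates step by step as the analysis passes.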

Centralizing Kubernetes and Container Operations

Developers see and realize the benefits of Kubernetes: it improves efficiency, saves time, and lets them focus on the unique business requirements of each project. InfoSec, infrastructure, and software operations teams, however, still face challenges when managing a new set of tools and technologies and integrating them into existing enterprise infrastructure.

Enabling Digital Transformation with Container Technologies

Digital transformation may be in danger of becoming an overused buzzword. Yet real business needs are driving this trend, and IT leaders feel the pressure to transform their businesses every day. Whether it is the need for speed, agility, or rethinking business processes as a whole, these challenges are here to stay.

Application Deployment with Kubernetes

Kubernetes helps keep your deployed applications available to users. But how do you deploy new versions in Kubernetes without interrupting users or dependent services? Should you write your own scripts using low-level Kubernetes objects, package everything in Helm, or use dedicated CI/CD tools? There isn’t a clear-cut answer; it always depends.
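As one concrete option among those listed above, the sketch below drives a zero-downtime rolling update by patching a Deployment’s container image with the Kubernetes Python client; Kubernetes then replaces pods gradually according to the Deployment’s update strategy. The Deployment name, namespace, container name, and image tag are hypothetical.

# Rolling-update sketch: patch a Deployment's image and let Kubernetes roll it out.
# Names and image tag are hypothetical.
from kubernetes import client, config

config.load_kube_config()
apps = client.AppsV1Api()

patch = {
    "spec": {
        "template": {
            "spec": {
                "containers": [
                    {"name": "web", "image": "registry.example.com/web:1.4.2"}
                ]
            }
        }
    }
}

# A strategic-merge patch; existing pods are replaced gradually per the
# Deployment's rollingUpdate strategy (maxUnavailable / maxSurge).
apps.patch_namespaced_deployment(name="web", namespace="production", body=patch)

Helm charts and CI/CD tools ultimately produce the same kind of API changes; the trade-off is how much templating, review, and automation you want around them.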

Kubernetes, Data Science and Machine Learning

Enabling support for data processing, data analytics, and machine learning workloads in Kubernetes has been one of the goals of the open-source community. During this meetup, we’ll discuss the growing use of Kubernetes for data science and machine learning workloads. We’ll examine how new Kubernetes extensibility features, such as custom resources and custom controllers, are used to integrate applications and frameworks. Apache Spark 2.3’s native Kubernetes support is the latest indication of this growing trend. We’ll demo a few examples of data science workloads running on Kubernetes clusters set up by our Kublr platform.
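As a rough illustration of the custom-resource and custom-controller pattern mentioned above (not from the talk itself), the sketch below watches SparkApplication custom resources, the kind defined by the open-source Spark operator, and reacts to add/delete events. The group, version, plural, and namespace are assumptions; adjust them to whatever custom resource you actually run.

# Minimal custom-controller sketch: watch custom resources and react to events.
# The group/version/plural follow the open-source Spark operator's SparkApplication
# CRD and are assumptions here.
from kubernetes import client, config, watch

config.load_kube_config()
custom = client.CustomObjectsApi()

GROUP, VERSION, PLURAL = "sparkoperator.k8s.io", "v1beta2", "sparkapplications"
NAMESPACE = "data-science"          # hypothetical namespace

w = watch.Watch()
for event in w.stream(custom.list_namespaced_custom_object,
                      GROUP, VERSION, NAMESPACE, PLURAL):
    obj = event["object"]
    name = obj["metadata"]["name"]
    if event["type"] == "ADDED":
        print(f"SparkApplication {name} created: a real controller would reconcile it here.")
    elif event["type"] == "DELETED":
        print(f"SparkApplication {name} deleted: clean up any resources it owned.")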

Autoscaling? Kubernetes Pods vs. Nodes

Kubernetes not only deploys and manages containers; it can also automatically scale the overall solution in numerous ways. This is a tremendous asset, especially in the modern cloud, where costs are based on the resources consumed.
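One of those ways, pod-level autoscaling, can be sketched as follows: a HorizontalPodAutoscaler that scales a Deployment between 2 and 10 replicas based on average CPU utilization. Node-level scaling is handled separately by a cluster autoscaler and is not shown. All names, bounds, and thresholds are illustrative.

# Pod-level autoscaling sketch: create a HorizontalPodAutoscaler for a Deployment.
# Deployment name, namespace, replica bounds, and CPU target are illustrative;
# node-level scaling would be handled by a cluster autoscaler, not shown here.
from kubernetes import client, config

config.load_kube_config()
autoscaling = client.AutoscalingV1Api()

hpa = {
    "apiVersion": "autoscaling/v1",
    "kind": "HorizontalPodAutoscaler",
    "metadata": {"name": "web-hpa", "namespace": "production"},
    "spec": {
        "scaleTargetRef": {"apiVersion": "apps/v1", "kind": "Deployment", "name": "web"},
        "minReplicas": 2,
        "maxReplicas": 10,
        "targetCPUUtilizationPercentage": 70,
    },
}

autoscaling.create_namespaced_horizontal_pod_autoscaler(namespace="production", body=hpa)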

Running Spark with Jupyter Notebook & HDFS on Kubernetes

Kublr and Kubernetes can help make your favorite data science tools easier to deploy and manage. The Hadoop Distributed File System (HDFS) carries the burden of storing big data, Spark provides many powerful tools for processing it, and Jupyter Notebook is the de facto standard UI for dynamically managing queries and visualizing results.
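To give a flavor of how these pieces fit together, the snippet below is a sketch of a Jupyter notebook cell that starts a PySpark session against a Kubernetes master and reads a file from HDFS. The API server address, executor image, and HDFS path are placeholders, and running in client mode from a notebook assumes Spark 2.4 or later.

# Sketch of a Jupyter cell: PySpark on Kubernetes reading from HDFS.
# API server URL, container image, and HDFS path are placeholders.
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .master("k8s://https://kubernetes.default.svc:443")
    .appName("jupyter-hdfs-demo")
    .config("spark.kubernetes.container.image", "registry.example.com/spark-py:latest")
    .config("spark.executor.instances", "3")
    .getOrCreate()
)

# Read raw text from HDFS and show a trivial aggregation in the notebook.
logs = spark.read.text("hdfs://hdfs-namenode:8020/data/web/access.log")
print("lines:", logs.count())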

Kubernetes and the Data Layer

Once you get your head around the concept of containers, and then the need for management and orchestration with tools like Kubernetes, what started as a weekend project suddenly raises more questions than answers. Kubernetes removes much of the complexity of managing the interaction between applications and the underlying infrastructure. It is designed to let developers focus on the applications and solutions rather than worrying about the complexity of the hosting platform.
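One concrete example of that abstraction at the data layer is persistent storage: an application simply claims storage, and Kubernetes binds it to whatever the underlying infrastructure provides. The sketch below creates such a claim with the Python client; the namespace, storage class, and size are assumptions.

# Data-layer sketch: request persistent storage without caring which backend provides it.
# Namespace, storage class, and size are hypothetical.
from kubernetes import client, config

config.load_kube_config()
core = client.CoreV1Api()

pvc = {
    "apiVersion": "v1",
    "kind": "PersistentVolumeClaim",
    "metadata": {"name": "postgres-data", "namespace": "databases"},
    "spec": {
        "accessModes": ["ReadWriteOnce"],
        "storageClassName": "standard",
        "resources": {"requests": {"storage": "20Gi"}},
    },
}

# Kubernetes (or the cloud provider's CSI driver) decides how the volume is actually provisioned.
core.create_namespaced_persistent_volume_claim(namespace="databases", body=pvc)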