The latest news and information on cloud monitoring, security, and related technologies.
Throughout this series we have been exploring how to use serverless architectures to our advantage. In this article I will show you: how to create a serverless Slack command using Node.js & Up, best practices for developing serverless applications, and a curated list of serverless resources.
In Part 2: Serverless Scales, I briefly touched on how a serverless architecture can have a cost benefit. In this post, I will go over: how to approach and analyze the cost of serverless, and two detailed examples of a cost analysis.
In Part 1: What is Serverless?, I talked about how one of the biggest pros of a serverless architecture is how well it scales and how high availability is baked in. In this post, I’ll go over: how a traditional highly available, scalable architecture works; how a scalable serverless architecture works; and how you can benefit from a serverless architecture.
In this post, we’ll answer the following questions: What is serverless architecture (and what it’s not)? What are the pros and cons of serverless?
One of the biggest challenges in a self-provisioned, public cloud environment like Amazon Web Services (AWS) is finding the right balance between resources, performance, and cost. With no initial visibility into usage stats, AWS customers tend to overprovision compute, storage, and database resources to cushion sudden spikes in demand. With visibility into actual resource usage, they could determine whether the capacity they have provisioned is really in line with the application workload.
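As a rough illustration of that kind of visibility, the sketch below uses the AWS SDK for Node.js to pull hourly average CPU utilization for a single EC2 instance from CloudWatch over the past day; the region and instance ID are placeholder assumptions, and the same approach applies to RDS or EBS metrics. If the hourly averages consistently sit far below the instance's capacity, that is a sign the instance is overprovisioned for its workload.

```javascript
// Minimal sketch: compare provisioned capacity against actual usage by
// reading hourly average CPUUtilization for one EC2 instance from CloudWatch.
const AWS = require('aws-sdk');

// Region and instance ID below are placeholder assumptions for illustration.
const cloudwatch = new AWS.CloudWatch({ region: 'us-east-1' });

const params = {
  Namespace: 'AWS/EC2',
  MetricName: 'CPUUtilization',
  Dimensions: [{ Name: 'InstanceId', Value: 'i-0123456789abcdef0' }],
  StartTime: new Date(Date.now() - 24 * 60 * 60 * 1000), // 24 hours ago
  EndTime: new Date(),
  Period: 3600,            // one datapoint per hour
  Statistics: ['Average'],
};

cloudwatch.getMetricStatistics(params, (err, data) => {
  if (err) return console.error(err);
  // Low averages across the whole day suggest the instance is overprovisioned.
  data.Datapoints
    .sort((a, b) => a.Timestamp - b.Timestamp)
    .forEach((dp) =>
      console.log(`${dp.Timestamp.toISOString()}  avg CPU ${dp.Average.toFixed(1)}%`)
    );
});
```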
Having developers work hand in hand with IT professionals is vital to maintaining stable application environments with minimal errors. Ideally, they work in the same environment. This is called DevOps. Learn why DevOps and collaboration are necessary.