This tutorial shows you how to use the Amazon SageMaker Orb to orchestrate model deployment to endpoints across different environments, and how to use the CircleCI platform to monitor and manage promotions and rollbacks. It uses an example project repository to walk you through every step, from training a new model package version to deploying your model across multiple environments.
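The staged promotion described above can be sketched as a CircleCI config: deploy to a staging endpoint, pause for a manual approval, then promote to production. The job names, parameters, and `deploy.py` script below are hypothetical placeholders for illustration, not the SageMaker orb's actual API:

```yaml
version: 2.1

# Hypothetical sketch of a promotion workflow. The deploy command and
# job names are assumptions; substitute the orb's real jobs in practice.
jobs:
  deploy-endpoint:
    parameters:
      environment:
        type: string
    docker:
      - image: cimg/python:3.11
    steps:
      - checkout
      - run:
          name: Deploy model to << parameters.environment >>
          command: python deploy.py --env << parameters.environment >>

workflows:
  promote-model:
    jobs:
      - deploy-endpoint:
          name: deploy-staging
          environment: staging
      # Manual approval gate: the pipeline pauses here until a team
      # member approves the promotion in the CircleCI UI.
      - hold-for-approval:
          type: approval
          requires: [deploy-staging]
      - deploy-endpoint:
          name: deploy-production
          environment: production
          requires: [hold-for-approval]
```

The `type: approval` job is CircleCI's built-in mechanism for gating promotions; a rollback can reuse the same parameterized job to redeploy a previous model package version.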
Top tips is a weekly column where we highlight what's trending in the tech world and list ways to explore those trends. This week, we're examining four use cases for AI in the ever-growing FinTech sector, which has transformed the discussion around the financial services industry from top to bottom.
Amazon Bedrock is a fully managed service that offers foundation models (FMs) built by leading AI companies, such as AI21 Labs, Meta, and Amazon, along with other tools for building generative AI applications. After enabling access to validation and training data stored in Amazon S3, customers can fine-tune their FMs for tasks such as text generation, content creation, and chatbot Q&A, without provisioning or managing any infrastructure.
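A minimal sketch of what invoking a Bedrock text-generation model looks like. The prompt is hypothetical; the request-body shape shown is the one used by Amazon's Titan text models, and the actual network call (which requires AWS credentials and model access to be enabled) is shown in comments:

```python
import json

# Titan text model ID; availability depends on model access being enabled
# in your AWS account and region.
MODEL_ID = "amazon.titan-text-express-v1"

def build_titan_request(prompt: str, max_tokens: int = 256) -> str:
    """Build the JSON request body for a Titan text-generation invocation."""
    return json.dumps({
        "inputText": prompt,
        "textGenerationConfig": {
            "maxTokenCount": max_tokens,
            "temperature": 0.5,
        },
    })

# Hypothetical prompt for illustration.
body = build_titan_request("Summarize our Q3 support tickets.")
print(body)

# With access enabled, the request would be sent via boto3's
# bedrock-runtime client (sketch; requires AWS credentials):
#   import boto3
#   client = boto3.client("bedrock-runtime")
#   resp = client.invoke_model(modelId=MODEL_ID, body=body,
#                              contentType="application/json",
#                              accept="application/json")
#   print(json.loads(resp["body"].read())["results"][0]["outputText"])
```

Because the service is fully managed, this request body is all the caller supplies; there is no cluster or serving infrastructure to configure.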
Generative AI is revolutionizing the way businesses operate, from improving operational resilience to mitigating security risks and enhancing customer experiences. In a recent roundup of C-suite insights from three IT leaders (Matt Minetola, CIO; Mandy Andress, CISO; and Rick Laner, chief customer officer), we gain a comprehensive understanding of how generative AI is being used to improve business outcomes across organizations.
Generative AI has the world thinking about automation now more than ever, but the Information Technology Infrastructure Library (ITIL) has prioritized it from the start, advocating for automation as a transformative tool for organizations to deliver business value, accelerate change, and reinvent service configuration management. By handling mundane tasks, automation frees people to do more innovative and effective work.