
Mainframe modernization with AWS: A complete guide for 2026

10 minute read
Content level: Intermediate

This article explains the toolsets and features available to migrate and modernize your mainframe applications on AWS.

Overview

AWS provides a comprehensive set of tools and services to migrate and modernize your mainframe applications at scale.

There are various migration patterns, popularly known as the 8 R approach, to select from when migrating legacy applications. Customers can choose from a diverse set of AWS tools based on their migration requirements and patterns.

| Migration Pattern | Source Language | Toolsets | Target Language |
|---|---|---|---|
| Reimagine | COBOL, PL/1 | AWS Transform for mainframe + Kiro | Java, Python, .Net, NodeJs, … |
| Reimagine | Other than COBOL, PL/1 | AWS Transform custom + Kiro | Java, Python, .Net, NodeJs, … |
| Refactor | z/OS COBOL, PL/1 | AWS Transform for mainframe | Java, Angular, Groovy |
| Refactor | Assembler | mLogica | COBOL, Java |
| Replicate | DB2, IMS, VSAM, File | Precisely | RDS, DynamoDB, Kafka, Amazon S3 |
| Replatform | COBOL, PL/1 | Rocket (Micro Focus), NTT DATA | COBOL, PL/1 |


⛔ Note: The managed runtime environment experience of AWS Mainframe Modernization service is no longer available to new customers. However, the self-managed runtime version is still available for both Rocket Software (replatform) and AWS Blu Age (refactor) capabilities.

Some AWS Blu Age features are now merged into the new AWS Transform for mainframe service.


Into the details - the Big Picture


Figure: AWS mainframe migration toolsets with their components


✨ I. AWS Transform

⫘⫘⫘⫘⫘⫘⫘⫘⫘⫘⫘⫘⫘⫘⫘⫘⫘

AWS Transform is the first agentic AI service for large-scale migration and modernization of legacy applications. It cuts modernization timelines from years to months at significantly reduced cost by streamlining the core phases of mainframe modernization, from initial analysis and planning through code refactoring and application reimagining.

It is built on four software layers:

⚙️ 1. Amazon AgentCore layer

∘₊✧──────────────✧₊∘

The bottommost layer is the orchestration layer, built on Amazon AgentCore, which is used to build, deploy, and operate AI agents securely and effectively with minimal code.

AgentCore hides the complexities of managing the infrastructure and provides out-of-the-box features such as intelligent memory, a gateway, monitoring, and security. Memory enables intelligent, personalized experiences that maintain knowledge and learn from user interactions. The gateway provides secure, controlled access to the tools and data the agents need to produce optimal responses.

It runs on top of AWS's own purpose-built hardware layer of Trainium and Inferentia chips and Graviton processors to deliver high performance at lower cost, reducing the carbon footprint.


📦 2. Foundation Layer

∘₊✧──────────✧₊∘

Next from the bottom is the Foundation Layer, which hosts the essential supporting tools for running the agents.

This layer comprises several components:

🗂️ Workspace manager:
The resource that contains other resources like connectors and jobs. A workspace serves as a permissions boundary. Connectors are asset providers that allow access to customer-owned resources in a system external to AWS Transform such as the source code stored in Amazon S3. Results generated from AWS Transform are saved in the artifact store in S3.
📜 Job Orchestration:
A job is a long-running process (often spanning days) that AWS Transform executes to fulfill an objective defined by a user. It is made up of multiple tasks and collaborator requests. At different stages, the job interacts with the user (human-in-the-loop [HITL]) for validation and confirmation. The progress and history of a job can be viewed in the Worklogs.
🤖 Chat Service:
User interaction with AWS Transform is primarily via chat: to create a job, define job objectives, provide confirmations as part of HITL, restart a job, and ask questions about the legacy application.
🧠 Knowledge Base:
Users can enable the Amazon S3 Vector store within AWS Transform to create an extensive Knowledge Base to provide an AI powered search and enhanced chat experience.
🏷️ Agent Registry:
Partners and customers can register their own specialized agents within AWS Transform via 🧩 Composability and run them as part of the migration workflow.
🚧 The agent registration process is not automated at this time. Please contact your AWS representative for more details.
👉 Learn more about Composability in the blog
👉 View interactive demo from a couple of early adopters here.
🛡️ Authorization:
The authorization module maintains the security and access among the various components within AWS Transform. It also orchestrates the isolation within AWS account boundaries and user access within workspaces.
👉 Learn more about security here
🎓 Learning:
The agents learn from user interactions and feedback.

⚛ 3. Agents Layer

∘₊✧───────✧₊∘

Third from the bottom is the agents layer, which hosts sets of AI agents for different domains: mainframe, VMware, Windows, and custom.

There are several mainframe agents, which can be combined into a workflow called a job. A job is created by defining a migration objective to the chat agent. The agents can be broadly categorized into four groups:

🔬 Analysis:
The analysis agents automate analysis of mainframe codebases and categorize code components, including JCL, BMS, COBOL programs, and copybooks. The insights identify dependencies and detect missing artifacts, generating visual representations of dependencies alongside key metrics such as lines of code and component classification.

Code: Analyzes the source code artifacts to determine their types, lines of code, code complexity, dependencies, and missing elements. This discovery step is mandatory and must be executed first for all jobs.


Data Source: Identifies the data sources like VSAM, DB2, IMS, files, etc. present in the provided source codes, along with their metadata and usage.


Activity: Combines static analysis results with mainframe runtime activity data to create the usage report of the batch jobs, online transactions and their associated elements.

📌 Note: The outputs from the analysis steps are generated using a deterministic approach.
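To make the idea of deterministic static analysis concrete, here is a minimal, illustrative sketch of what such a step does: scan COBOL sources for CALL statements, count lines of code, and flag called programs that have no matching source (missing artifacts). The program names and sources below are invented for illustration; this is not AWS Transform's actual implementation.

```python
import re

# Toy COBOL sources, keyed by program name (illustrative only).
SOURCES = {
    "PAYROLL": 'IDENTIFICATION DIVISION.\nPROGRAM-ID. PAYROLL.\n'
               'PROCEDURE DIVISION.\n    CALL "TAXCALC" USING WS-PAY.\n'
               '    CALL "PRINTCHK" USING WS-PAY.\n',
    "TAXCALC": 'IDENTIFICATION DIVISION.\nPROGRAM-ID. TAXCALC.\n'
               'PROCEDURE DIVISION.\n    COMPUTE WS-TAX = WS-PAY * 0.2.\n',
}

# Static CALL statements with a literal program name.
CALL_RE = re.compile(r'CALL\s+"([A-Z0-9-]+)"')

def analyze(sources):
    """Return per-program lines of code, call dependencies, and missing callees."""
    report = {}
    for name, text in sources.items():
        calls = CALL_RE.findall(text)
        report[name] = {
            "loc": len(text.splitlines()),
            "calls": calls,
            # A callee with no source in the inventory is a missing artifact.
            "missing": [c for c in calls if c not in sources],
        }
    return report

report = analyze(SOURCES)
print(report["PAYROLL"])  # PRINTCHK has no source, so it is flagged missing
```

A real analyzer must also resolve copybooks, dynamic calls, and JCL job steps, which is why the discovery step runs first and feeds every later agent.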

📝 Documentation:
The documentation agents generate comprehensive documentation of mainframe applications, capturing business logic summaries, acronym definitions, and context-aware insights. This documentation acts as a structured knowledge base, making it easier for your teams to query and understand applications. These agents use generative AI heavily; as a result, some steps run for a long time (hours to days).

Business: Generates application business-level documentation with a summary, key business functionalities, process flows, and business rules. It also identifies the business domains present in the supplied application code and creates documentation at the application, domain, entry-point, and component level. The results can be viewed in the browser and are also available for download as HTML or JSON for offline viewing.


Technical: Generates technically focused documentation with program logic, data source, integrations, input and output details. The documents can be downloaded as PDF or XML format.

🗓️ Migration Planning:
Simplifies modernization and reduces risk by breaking down large, monolithic applications into smaller, manageable domains and generating planning guidance.

Decomposition: Decomposes the source code elements of the monolithic mainframe application and groups them into domains, finding the natural lines of demarcation. Using semantic seeding, it organizes applications into logical domains, while advanced dependency detection ensures proper separation of classified code.


Wave Planning: Generates a migration wave plan with recommended modernization sequence and approximate timeline, prioritizing transformation phases based on lines of code, business objectives, or dependencies.
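Sequencing by dependencies can be sketched with a simple topological layering: each component is scheduled in a wave after everything it depends on. The component names and dependency map below are invented for illustration, and real wave planning also weighs lines of code and business objectives; this is only a sketch of the dependency-ordering idea.

```python
def plan_waves(deps):
    """Group components into migration waves: a component's wave comes
    after the waves of everything it depends on (topological layering)."""
    waves = []
    placed = set()
    remaining = set(deps)
    while remaining:
        # Components whose dependencies are all already placed can go now.
        ready = {c for c in remaining if set(deps[c]) <= placed}
        if not ready:
            raise ValueError("cyclic dependency; needs manual decomposition")
        waves.append(sorted(ready))
        placed |= ready
        remaining -= ready
    return waves

# Toy dependency map: component -> components it depends on (illustrative).
deps = {
    "ACCOUNTS": [],
    "BILLING": ["ACCOUNTS"],
    "REPORTS": ["ACCOUNTS", "BILLING"],
    "AUDIT": [],
}
print(plan_waves(deps))  # [['ACCOUNTS', 'AUDIT'], ['BILLING'], ['REPORTS']]
```

Cycles in the dependency graph surface as an error here; in practice that is exactly where decomposition has to cut a shared dependency before waves can be sequenced.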

🏗 Migration:
The migration agents help transform code and generate test scripts and data following the refactor pattern.

Refactor: Transforms millions of lines of COBOL and PL/1 code into Java/Angular in minutes, preserving exact business functionality, using automated tooling from AWS Blu Age.


Reforge: Optimizes the refactored Java code for maintainability and readability.


Testing: Generates automated test plans, data collection scripts, and execution automation based on actual application dependencies and business logic.


👉 Learn more about the Refactor approach in the blog and testing strategy in the blog.

👉 View interactive demo here.

👉 Reach out to AWS contact to schedule a guided hands-on refactor workshop for your organization.

🧩🧩 Custom:
AWS Transform custom can be used to handle migration of unsupported technologies. The agents can be trained via examples and refined through iteration to build the custom transformation process.

👉 View interactive demo here.


🖥️ 4. UI Layer

∘₊✧─────✧₊∘

Users interact with AWS Transform through a web browser. For AWS Transform custom, some activities have to be performed via the command line interface (CLI).

User authorization to workspaces and jobs within AWS Transform is managed by an admin from the AWS Console. The available options to onboard users are IAM Identity Center, Microsoft Entra ID (formerly Azure Active Directory), and Okta Workforce Identity. Details can be found here.


👻 II. Kiro

⫘⫘⫘⫘⫘⫘⫘⫘

💫 1. Forward Engineering

∘₊✧────────────✧₊∘

Kiro is an AI-powered, agentic IDE that uses a "spec-driven development" approach, helping developers go from natural language ideas to production-ready code. It can generate cloud native code, test cases, test data and automation scripts to create the AWS resources.

Kiro can be used to reimagine the legacy application using the artifacts and documentation generated in AWS Transform. Forward engineering can be orchestrated in stages by generating specs (requirements, design, tasks) and automating implementation, testing, and documentation with specialized AI agents. Advanced steering, a knowledge base MCP, and custom agents can accelerate the migration journey.


👉 Learn more about the Reimagine approach in the blog & article.

👉 View interactive demo here.

👉 Reach out to AWS contact to schedule a guided hands-on reimagine workshop for your organization.


📝 2. Unsupported languages

∘₊✧──────────────✧₊∘

Kiro can also be used to analyze and document legacy applications built using programming languages not supported by AWS Transform.


🎛️ III. Additional Migration Options

⫘⫘⫘⫘⫘⫘⫘⫘⫘⫘⫘⫘⫘⫘⫘⫘⫘⫘⫘⫘⫘⫘⫘⫘⫘⫘⫘⫘⫘

AWS provides additional migration tools and services through the partner network.

⿻ 1. Replicate
Augment mainframe functions and unleash innovation by leveraging mainframe data through near real-time data replication with Precisely and file transfer with BMC.
🗄️ 2. Replatform
Migrate COBOL and PL/I applications with the integrated Rocket (Micro Focus) and NTT DATA toolchains to preserve the programming language while modernizing infrastructure and processes for agility with cloud-native DevOps operations. With this toolchain, you can also modernize mainframe relational databases, data sets, and hierarchical databases. Applications are moved to the AWS Cloud with minimal changes to the source code. It supports both online transactions and batch jobs.
🔢 3. Refactor (Assembler)
Automatically convert z/OS mainframe legacy Assembler code to COBOL or Java using LIBER*M from mLogica.
🤜🤛 4. Mainframe Migration Competency Partner Network
AWS collaborates closely with partners offering specialized tools and service offerings. Find the list of partners here.

────────⋆⋅☆⋅⋆────────

📚 References:

  1. Customer references from re:Invent 2025: Western Union & Unum, BMW & Fiserv, Itau & ADP

  2. List of other resources

  3. Interactive demos home page