IBM watsonx AI and data platform, security solutions and consulting services for generative AI to be showcased at AWS re:Invent

According to a Gartner® report, “By 2026, more than 80% of enterprises will have used generative AI APIs or models, and/or deployed GenAI-enabled applications in production environments, up from less than 5% in 2023.”* However, to be successful, enterprises need the flexibility to run generative AI on their existing cloud environments. That’s why we continue expanding the IBM and AWS collaboration, providing clients the flexibility to build and govern their AI projects using the watsonx AI and data platform with AI assistants on AWS.

With sprawling data underpinning these AI projects, enterprises are increasingly looking to data lakehouses to bring it all together in one place where they can access, cleanse and manage it. To that end, watsonx.data, a fit-for-purpose data store built on an open data lakehouse architecture, is already available as fully managed software-as-a-service (SaaS) on Red Hat OpenShift and Red Hat OpenShift Service on AWS (ROSA)—all accessible in the AWS Marketplace.

The watsonx.governance toolkit and watsonx.ai, the next-generation studio for AI builders, will follow in early 2024, making the full watsonx platform available on AWS. This gives clients a full stack of capabilities to train, tune and deploy AI models with trusted data, speed and governance, with increased flexibility to run their AI workflows wherever they reside.

During AWS re:Invent, IBM will show how clients accessing Llama 2 from Amazon SageMaker will be able to use the watsonx.governance toolkit to govern both the training data and the AI, so they can operate and scale with trust and transparency. Watsonx.governance can also help manage these models against regulatory guidelines and against risks tied to the model itself and the application using it.

We’ll also be unveiling several exciting pieces of news about our fast-growing partnership, and showcasing the following joint innovations:

  • IBM Security’s Program for Service Providers: A new program for Managed Security Service Providers (MSSPs) and cloud system integrators to accelerate their adoption of IBM security software delivered on AWS. This program helps security providers develop and deliver threat detection and data security services designed specifically for protecting SMB clients. It also enables service providers to deliver services that can be listed in the AWS Marketplace, leveraging IBM Security software that features built-in AWS integrations, significantly speeding and simplifying onboarding.
  • Apptio Cloudability and IBM Turbonomic Integration: Since IBM’s acquisition of Apptio closed in August, teams have been working on the integration of Apptio Cloudability, a cloud cost-management tool, and Turbonomic, an IT resource management tool for continuous hybrid cloud optimization. Today, key optimization metrics from Turbonomic can be visualized within the Cloudability interface, providing deeper cost analysis and savings for AWS Cloud environments.
  • Workload Modernization: We’re providing tools and services for deployment and support to simplify and automate the modernization and migration path from on-premises to as-a-service versions of IBM Planning Analytics, Db2 Warehouse and IBM Maximo Application Suite on AWS.
  • Growing Software Portfolio: We now have 25 SaaS products available on AWS, including watsonx.data, App Connect, Maximo Application Suite, IBM Turbonomic and three new SaaS editions of Guardium Insights. There are now more than 70 IBM listings in the AWS Marketplace. As part of an ongoing global expansion of our partnership, the IBM software and SaaS catalog (limited release) is now available for our clients in Denmark, France, Germany and the United Kingdom to procure via the AWS Marketplace.

In addition to these software capabilities, IBM is growing its generative AI capabilities and expertise with AWS—delivering new solutions to clients and training thousands of consultants on AWS generative AI services. IBM also launched an Innovation Lab in collaboration with AWS at the IBM Client Experience Center in Bangalore. This builds on IBM’s existing expertise with AWS generative AI services, including Amazon SageMaker, Amazon CodeWhisperer and Amazon Bedrock.

IBM is the only technology company with both AWS-specific consulting expertise and complementary technology spanning data and AI, automation, security and sustainability capabilities—all built on Red Hat OpenShift Service on AWS and running cloud-native on AWS.

For more information about the IBM and AWS partnership, please visit www.ibm.com/aws. Visit us at AWS re:Invent in booth #930. Don’t miss these sessions from IBM experts exploring hybrid cloud and AI:

  • Hybrid by Design at USAA: 5:00 p.m., Tuesday, November 28, The Venetian, Murano 3306
  • Scale and Accelerate the Impact of Generative AI with watsonx: 4:30 p.m., Wednesday, November 29, Wynn Las Vegas, Cristal 7

Learn more about the IBM and AWS partnership


*Gartner. Hype Cycle for Generative AI, 2023, 11 September 2023. Gartner and Hype Cycle are registered trademarks of Gartner, Inc. and/or its affiliates in the U.S. and internationally and are used herein with permission. All rights reserved.


Application modernization overview

Application modernization is the process of updating legacy applications with modern technologies, enhancing performance and making them adaptable to evolving business needs by infusing cloud-native principles like DevOps and Infrastructure as Code (IaC). Application modernization starts with an assessment of the current legacy applications, data and infrastructure, followed by applying the right modernization strategy (rehost, re-platform, refactor or rebuild) to achieve the desired result.

Rebuild delivers the maximum benefit but requires a high degree of investment, whereas rehost moves applications and data to the cloud as-is, without optimization; it requires less investment but delivers less value. Modernized applications are deployed, monitored and maintained, with ongoing iterations to keep pace with technology and business advancements. Typical benefits range from increased agility to cost-effectiveness and competitiveness, while challenges include complexity and resource demands. Many enterprises are realizing that moving to cloud delivers neither the desired value nor the agility and speed beyond basic platform-level automation. The real problem lies in how IT is organized, which is reflected in how their current applications and services are built and managed (see Conway’s law). This, in turn, leads to the following challenges:

  • Duplicative or overlapping capabilities offered by multiple IT systems and components create sticky dependencies and proliferation, which impact productivity and speed to market.
  • Duplicative capabilities across applications and channels give rise to duplicative IT resources (e.g., skills and infrastructure).
  • Duplicative capabilities (including data) that result in duplicated business rules and the like give rise to inconsistent customer experience.
  • Lack of alignment of IT capabilities to business capabilities impacts time to market and business-IT alignment. In addition, enterprises end up building several band-aids and architectural layers to support new business initiatives and innovations.

Hence, application modernization initiatives need to focus more on the value to the business, and this involves significant transformation of the applications into components and services aligned to business capabilities. The biggest challenge is the amount of investment needed; many CIOs and CTOs are hesitant to invest due to the cost and timelines involved in realizing value. Many address this by building accelerators that can be customized for enterprise consumption to speed up specific areas of modernization; one such example from IBM is IBM Consulting Cloud Accelerators. While attempting to drive acceleration and optimize the cost of modernization, Generative AI is becoming a critical enabler for changing how we accelerate modernization programs. We will explore key areas of acceleration with an example in this article.

A simplified lifecycle of application modernization programs (not meant to be exhaustive) spans discovery, planning, blueprint/design, build and test, and deployment. Discovery focuses on understanding the legacy application, infrastructure, data, interactions between applications, services and data, and other aspects such as security. Planning breaks the complex portfolio of applications into iterations to be modernized, establishing an iterative roadmap and an execution plan to implement it.

Blueprint/design phase activities change based on the modernization strategy, from decomposing the application using domain-driven design to establishing a target architecture based on new technology and building executable designs. Subsequent phases are build and test, followed by deployment to production. Let us explore the Generative AI possibilities across these lifecycle areas.

Discovery and design:

The ability to understand legacy applications with minimal SME involvement is a critical acceleration point. In general, SMEs are busy keeping systems running (“lights-on” work), and their knowledge may be limited by how long they have been supporting the systems. Collectively, discovery and design are where significant time is spent during modernization, whereas development is much easier once the team has decoded the legacy application’s functionality, integration aspects, logic and data complexity.

Modernization teams perform code analysis and go through several documents (mostly dated); this is where their reliance on code analysis tools becomes important. Further, for rewrite initiatives, one needs to map functional capabilities to the legacy application context to perform effective domain-driven design and decomposition exercises. Generative AI becomes very handy here through its ability to correlate domain and functional capabilities to code and data, establishing a business capabilities view with connected application code and data; of course, the models need to be tuned and contextualized for a given enterprise domain model or functional capability map. The Generative AI-assisted API mapping called out in this article is a mini exemplar of this. While the above applies to application decomposition and design, event-storming needs process maps, and this is where Generative AI assists in contextualizing and mapping extracts from process mining tools. Generative AI also helps generate use cases based on code insights and functional mapping. Overall, Generative AI helps de-risk modernization programs by ensuring adequate visibility into legacy applications and their dependencies.

Generative AI also helps generate target designs for a specific cloud service provider framework by tuning the models on a set of standardized patterns (ingress/egress, application services, data services, composite patterns and so on). Likewise, there are several other Generative AI use cases, including generating target technology framework-specific code patterns for security controls. Generative AI also helps generate detailed design specifications, for example user stories, user experience wireframes, API specifications (e.g., Swagger files), component relationship diagrams and component interaction diagrams.

Planning:

One of the difficult tasks of a modernization program is establishing a macro roadmap while balancing parallel efforts against sequential dependencies and identifying co-existence scenarios to be addressed. While this is normally done as a one-time task, continuous realignment through Program Increment (PI) planning exercises that incorporate execution-level inputs is far more difficult. Generative AI comes in handy here, generating roadmaps from historical data (application-to-domain-area maps, effort and complexity factors, dependency patterns and so on) and applying it to the applications in scope of a modernization program for a given industry or domain.

The only way to address this is to make it consumable via a suite of assets and accelerators that can address enterprise complexity. This is where Generative AI plays a significant role in correlating application portfolio details with discovered dependencies.

Build and test:

Generating code is one of the most widely known Generative AI use cases, but it is important to be able to generate a set of related code artifacts, ranging from IaC (Terraform or CloudFormation templates), pipeline code and configurations, and embedded security design points (encryption, IAM integrations and so on) to application code generated from Swagger files or other code insights (from legacy systems) and firewall configurations (as resource files, based on the services instantiated). Generative AI helps generate each of the above through an orchestrated approach based on predefined application reference architectures built from patterns, while combining the outputs of design tools.

Testing is another key area; Generative AI can generate the right set of test cases, test code and test data, optimizing the set of test cases that get executed.
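
For instance, generated test code can arrive together with generated test data. The following is a purely illustrative sketch, in the style of a JUnit 5 test an accelerator might emit; the discount rule, class names and values are invented for the example.

import static org.junit.jupiter.api.Assertions.assertEquals;

import org.junit.jupiter.api.Test;

// Hypothetical example of generated test code plus test data for a simple
// discount rule decoded from a legacy application.
class DiscountCalculatorTest {

    // Stand-in for the modernized implementation under test.
    static double discountedTotal(double total) {
        return total > 100.0 ? total * 0.9 : total; // 10% off orders over 100
    }

    @Test
    void appliesDiscountAboveThreshold() {
        assertEquals(180.0, discountedTotal(200.0), 1e-9);
    }

    @Test
    void noDiscountAtOrBelowThreshold() {
        assertEquals(100.0, discountedTotal(100.0), 1e-9);
    }
}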

Deploy:

There are several last-mile activities that typically take days to weeks, depending on enterprise complexity. The ability to generate insights for security validation (from application and platform logs, design points, IaC and so on) is a key use case that helps accelerate security review and approval cycles. Generating configuration management inputs (for the CMDB) and change management inputs, based on release notes generated from Agility-tool work items completed per release, are other key areas where Generative AI can be leveraged.

While the above-mentioned use cases across modernization phases may appear to be a silver bullet, enterprise complexity will necessitate contextual orchestration of many of these Generative AI-based accelerators to realize value, and we are far from having established enterprise-contextual patterns that accelerate modernization programs. We have seen significant benefit in investing time and energy upfront (and on an ongoing basis) in customizing many of these Generative AI accelerators for certain patterns, based on their potential repeatability.

Let us now examine two proven examples:

Example 1: Re-imagining API Discovery with BIAN and AI for visibility of domain mapping and identification of duplicative API services

The Problem: A large global bank has more than 30,000 APIs (both internal and external) developed over time across various domains (e.g., retail banking, wholesale banking, open banking and corporate banking). There is huge potential for duplicate APIs across these domains, leading to a higher total cost of ownership for maintaining the large API portfolio and operational challenges in dealing with API duplication and overlap. A lack of visibility and discovery of the APIs leads API development teams to build the same or similar APIs rather than find relevant APIs for reuse. The inability to visualize the API portfolio from a banking industry model perspective prevents the business and IT teams from understanding which capabilities are already available and which new capabilities are needed by the bank.

Generative AI-based solution approach: The solution leverages a BERT large language model, Sentence Transformers, a Multiple Negatives Ranking Loss function and domain rules, fine-tuned with BIAN Service Landscape knowledge, to learn the bank’s API portfolio and provide the ability to discover APIs with auto-mapping to BIAN. It maps each API endpoint method to level 4 of the BIAN Service Landscape hierarchy, that is, to BIAN service operations.

The core functions of the solution are the ability to:

  • Ingest Swagger specifications and other API documentation, and understand the APIs, endpoints, operations and the associated descriptions.
  • Ingest BIAN details and understand the BIAN Service Landscape.
  • Fine-tune with matched and unmatched mappings between API endpoint methods and the BIAN Service Landscape.
  • Provide a visual representation of the mapping and matching score, with BIAN hierarchical navigation and filters for BIAN levels, API category and matching score.

[Figure: Overall logical view of the solution (Open Stack based)]

[Figure: User interface for API discovery with the industry model]
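
To make the matching step concrete, the sketch below scores an API endpoint description against precomputed embeddings of BIAN service operations using cosine similarity. It is illustrative only: the embed() method is a hypothetical stand-in for the fine-tuned sentence-transformer model, and all class and method names are invented.

import java.util.Map;

// Illustrative sketch only: maps an API endpoint description to the
// best-matching BIAN service operation by cosine similarity of embeddings.
public class BianApiMapper {

    // Hypothetical: in the real solution, a fine-tuned BERT sentence
    // transformer produces these vectors.
    static double[] embed(String text) {
        throw new UnsupportedOperationException("model call goes here");
    }

    static double cosine(double[] a, double[] b) {
        double dot = 0, na = 0, nb = 0;
        for (int i = 0; i < a.length; i++) {
            dot += a[i] * b[i];
            na += a[i] * a[i];
            nb += b[i] * b[i];
        }
        return dot / (Math.sqrt(na) * Math.sqrt(nb));
    }

    /** Returns the best-matching BIAN service operation and its score. */
    static Map.Entry<String, Double> bestMatch(String apiDescription,
                                               Map<String, double[]> bianEmbeddings) {
        double[] apiVec = embed(apiDescription);
        return bianEmbeddings.entrySet().stream()
            .map(e -> Map.entry(e.getKey(), cosine(apiVec, e.getValue())))
            .max(Map.Entry.comparingByValue())
            .orElseThrow();
    }
}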

Key Benefits: The solution helped developers easily find reusable APIs based on BIAN business domains, with multiple filter and search options to locate APIs. In addition, teams were able to identify the key API categories for building the right operational resilience. The next revision of search will be based on natural language, making it a conversational use case.

The ability to identify duplicative APIs based on BIAN service domains helped establish a modernization strategy that addresses duplicative capabilities while rationalizing them.

This use case was realized within 6–8 weeks, whereas the bank would have taken a year to achieve the same result (as there were several thousand APIs to be discovered).

Example 2: Automated modernization of MuleSoft API to Java Spring Boot API

The Problem: While the teams were already on a journey to modernize MuleSoft APIs to Java Spring Boot, the sheer volume of APIs, the lack of documentation and the complexity involved were impacting speed.

Generative AI-based Solution Approach: The MuleSoft API to Java Spring Boot modernization was significantly automated via a Generative AI-based accelerator we built. We began by establishing a deep understanding of the APIs, their components and the API logic, followed by finalizing response structures and code. We then built prompts using IBM’s version of Sidekick AI to generate Spring Boot code satisfying the API specs from MuleSoft, along with unit test cases, a design document and a user interface.

Mule API components were fed into the tool one by one using prompts, and the tool generated the corresponding Spring Boot equivalents, which were subsequently wired together, with any errors that cropped up addressed along the way. The accelerator also generated a UI for the desired channel that could be integrated with the APIs, unit test cases and test data, and design documentation. The generated design documentation consists of sequence and class diagrams, request and response structures, endpoint details, error codes and architecture considerations.
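
For illustration, a generated Spring Boot equivalent of a simple “get order” Mule flow might look like the following sketch. The endpoint, types and service wiring are hypothetical examples, not the accelerator’s actual output.

import java.util.Optional;

import org.springframework.http.ResponseEntity;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.PathVariable;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RestController;

// Hypothetical shapes standing in for generated domain and service classes.
record OrderDto(String id, String productCode, int quantity) {}

interface OrderService {
    Optional<OrderDto> findById(String id);
}

// A generated REST controller replacing a Mule flow that returned an order.
@RestController
@RequestMapping("/api/v1/orders")
class OrderController {

    private final OrderService orderService;

    OrderController(OrderService orderService) {
        this.orderService = orderService;
    }

    @GetMapping("/{orderId}")
    ResponseEntity<OrderDto> getOrder(@PathVariable String orderId) {
        return orderService.findById(orderId)
                .map(ResponseEntity::ok)
                .orElse(ResponseEntity.notFound().build());
    }
}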

Key Benefits: Sidekick AI augments application consultants’ daily work by pairing a multi-model Generative AI technical strategy with deep domain knowledge and technology context. The key benefits are as follows:

  • Generates most of the Spring Boot code and test cases, optimized, clean and adhering to best practices; the key is repeatability.
  • Eases integration of APIs with channel front-end layers.
  • Makes the code easier for developers to understand and provides enough insight for debugging it.

The accelerator PoC was completed across four different scenarios of code migration, unit test cases, design documentation and UI generation in three sprints over six weeks.

Conclusion

Many CIOs/CTOs have had reservations about embarking on modernization initiatives due to the multitude of challenges called out at the beginning: the amount of SME time needed, the impact to the business due to change, the operating model changes across security, change management and many other organizations, and so on. While Generative AI is not a silver bullet that solves all of these problems, it helps programs through acceleration, reduced modernization cost and, more significantly, de-risking, by ensuring that no current functionality is missed. However, one needs to understand that it takes time and effort to bring LLM models and libraries up to enterprise environment needs, with significant security and compliance reviews and scanning. It also requires focused effort to improve the quality of the data needed for tuning the models. While cohesive Generative AI-driven modernization accelerators are not yet available, in time we will see the emergence of integrated toolkits that accelerate certain modernization patterns, if not many.


Winning the cloud game: Phoning the right friend to answer the cloud optimization question

Cloud optimization is essential as organizations look to accelerate business outcomes and unlock the value of their data. At its core, cloud optimization is the process of correctly selecting and assigning the right resources to a workload or application. But cloud optimization is also a lifecycle process that balances performance, compliance and cost to achieve efficiency. And getting it right is crucial. Gartner predicts that public cloud will account for more than 51% of enterprise IT spending in key market segments by 2025, while Flexera’s 2023 State of the Cloud Report highlighted that managing cloud spend overtook security as the top challenge facing all organizations for the first time.

Research shows that 90% of enterprises have a multicloud strategy and 80% have a hybrid cloud strategy—a combination of public and private clouds. Only 7% of enterprises are using a single public cloud provider.

It’s easy to see the complexity of the cloud optimization problem given the use of multicloud. Many organizations have elected to deploy Cloud Centers of Excellence or FinOps practices with the goal of optimizing cloud spend. But building out a FinOps practice or Cloud Center of Excellence is easier said than done. It takes time and talent, and sometimes organizations are short on both. Cloud optimization goes well beyond simple cost reduction and workload placement; it is more about making sure your costs align with your business goals.

Remember the TV game show Who Wants to Be a Millionaire? There was a feature on the show called “Phone a Friend.” On the show, the contestant was connected with a friend over a phone line and was given 30 seconds to read the question and answers and solicit assistance. 

Of course, the contestant wants to call the RIGHT friend—the one that can help them with the correct answer and lead them to the money.

As it relates to cloud optimization, workload placement and app modernization, it feels like enterprises need a Phone a Friend feature. But they need to call the RIGHT friend too. 

Why should you phone a friend? Because you need help to answer the IT question and the clock is ticking. If an enterprise calls the wrong friend, they lose the chance to modernize their apps on time, optimize their costs and digitally transform. In short, by calling the WRONG friend, they lose the game.

Enterprises aren’t just looking for tools that manage resources and costs in multicloud and hybrid cloud environments. Tools are great but you need the right tools in the right order. 

Additionally, organizations need help to build out a roadmap or implement one, especially as they look to modernize legacy virtual environments. They want AI-powered automation capabilities that span from AIOps to application and infrastructure observability.

They need to win the game. They need the right friend with the right answers to help them win.

There is a strong synergy between digital transformation and IT modernization and it’s a long game. While transformation sets the vision and strategic direction of the organization, IT modernization is the practical implementation of that vision. These initiatives reshape and transform organizations. By embracing new technologies, organizations improve efficiency, enhance customer experience and remain competitive.

No matter where an organization is on this journey, phoning the “Right Friend” can move them along the game board: from siloed systems to integrated platforms, from on-premises infrastructure to cloud computing, and from monolithic applications to microservices. A solid strategy involves knowing which friend to call for guidance to help drive business growth and innovation.

Learn more about IBM and AWS today

The advantages and disadvantages of ERP systems

Enterprise resource planning (ERP) solutions offer organizations a one-stop-shop for managing daily operations. The business management software has gained popularity in the business world as organizations try to keep up with the changing landscape. As with most business solutions, there are advantages and disadvantages of ERP systems to consider.

It’s important to understand how enterprise resource planning can work for an organization and its capabilities at a granular level. Here are some key benefits an enterprise resource planning system can bring when managing all aspects of the business.

Advantages of ERP

Improve customer service

The business world is hyper-competitive, and that’s no different when it comes to attracting and retaining customers. The customer service experience is a vital part of an organization, and an ERP solution can help advance customer relationship management. Since a system like ERP software puts all customer information into one place, it can facilitate quicker customer service and a more personalized approach.

ERP stores contact information, order history, past support cases and more in one simplified system. Separately, since ERP tracks past orders and real-time inventory, the customer is much more likely to receive the correct items on time. If those factors are in place, it’s much more likely a customer leaves happy and returns for more down the road.

Customize reporting

Real-time data reporting is one of the highlights of an ERP solution and a serious advantage over other business management systems. With ERP reporting tools, organizations can customize reporting across many different functions, such as finance, inventory, procurement and human resources, and calculate it depending on what matters most to the organization. This tailor-made approach lets the business measure whichever KPIs it finds most important and track the performance of different business components.

The other advantage is that ERP offers the latest data in real time. This means that if an employee is trying to assess an issue, they aren’t analyzing outdated data; instead, they have the most accurate and up-to-date numbers to refer to. This customized reporting can help an organization make informed decisions, which is critical when the business environment is ever-changing.

Expand collaborations

The way that ERP solutions are built make for excellent collaboration across different departments. With integrated applications and data storage all under one solution, teams get a clear picture into how each is functioning and contributing to the business.

With the enterprise resource planning system in place, teams across the organization can communicate freely, as they aren’t functioning on separate platforms. The integration on the back end is extremely important and helps employees work as one. With access to all data, an employee on a seemingly unrelated team might be able to point out a malfunction or something that cuts down on duplicate work. This expanded collaboration can improve decision-making while providing a single source of truth for all data entry.

Greater sustainability

The fast-paced, ever-changing business world has placed a big emphasis on sustainability. C-suites are facing pressure from boards, investors, customers and others to mitigate the negative impact of their carbon emissions.

To find out how organizations use ERP implementation to attain sustainability goals, the IBM Institute for Business Value (IBV) and SAP, in collaboration with Oxford Economics, surveyed more than 2,125 senior executives involved in their organizations’ environmental sustainability strategies—around the world and across industries. The surprising result: those who outperform their competition in both environmental and financial outcomes also boast the most deeply engaged ERP implementation.

Improve transparency and insights

One of the benefits of ERP is that it offers full access to every business function and process in an organization all in one place. With the implementation of ERP, data from every department can be accessed by executive-level employees. The ERP solution monitors data daily and can provide day-to-day information, helping an organization be as precise as possible when it comes to factors such as inventory levels and business operations.

The complete visibility ERP provides gives organization leaders better functional business insights and more accurate business forecasting. As a result, organizations can streamline tasks and create clearer, more concise workflows. In addition, having accurate forecasting models is a competitive advantage, as they allow for improved data-driven strategy and decision-making. Because ERP can monitor each department and keep all data in one place, there’s an opportunity for more efficient processes and improved cross-collaboration. ERP can also improve business data security across the whole organization, for both on-premises and cloud-based ERP systems.

An example of the success of an ERP implementation is Neste, a market leader in renewable diesel, sustainable aviation fuel, and renewable polymers and chemicals based in Espoo, Finland. The company took a joint-team approach when it came to implementing its new ERP system. Neste worked with IBM Consulting™ for SAP to roll out the SAP S/4HANA solution on the Microsoft Azure cloud across most of its operations, including its renewables supply chains. Neste’s new ERP platform is enabling supply chain process efficiency improvements and making its data more transparent. “Among the most far-reaching benefits,” notes Neste Head of Integrated ERP, Marko Mäki-Ullakko, “is the ability to spot and resolve process inefficiencies.”

“We’ve been able to use SAP’s process discovery capabilities to spot supply chain and production bottlenecks,” he explained. “In that way, integrated SAP has been and will be a critical tool for our process optimization efforts.”

Increase flexibility and scalability

One of the unique features of ERP software is the inclusion of applications, or modules, across many different business needs. Applications such as procurement, supply chain management, inventory and project management are all separate modules offered under the ERP umbrella.

ERP applications can stand on their own but can also be integrated in the entirety of the ERP system, making for easier scalability and configuration in an organization. By being able to add or take away applications, ERP can help scale a business as it evolves over time.

Scalability will look different depending on which ERP solution your organization chooses. If a business plans to grow rapidly over time, a cloud-based ERP system is the best choice, since cloud ERP systems run on remote servers that make it easier to scale capacity as needed.

Increase productivity

By automating different tasks, ERP software frees up employees to work on more pertinent tasks and increases efficiency. The ERP system boosts productivity in a range of ways that all stem from automating basic tasks and making processes more straightforward. With the streamlined approach of an ERP system, less time is spent digging up information, allowing employees to perform other tasks faster. Manual data entry is not necessary, making tasks such as inventory management much easier and metrics tracking much simpler.

With a lens into the entire organization, employees are no longer tasked with tracking down the right data set or the employee who knows how a certain process works and can instead focus on more important tasks and projects. ERP solutions offer these features using technology, such as artificial intelligence (AI), machine learning, robotic process automation and more. These technologies support the automation and intelligent suggestion features in ERP software applications.

Reduce ongoing costs

The way an ERP solution is structured means data input occurs only once but can serve multiple purposes across the organization. This can save the business time and money by streamlining redundant tasks. The upfront costs and cost savings will also depend on which type of ERP solution you choose.

Without a centralized ERP software solution, organizations rely on numerous systems to run the business. The more systems, the higher the potential IT costs. An ERP system can reduce those costs. Separately, it can also reduce training requirements for end users, since they only need to learn one system. This can result in more profitability and fewer disruptions.

Standardize business processes

The purpose of implementing an ERP solution is to highlight and build from an organization’s best practices and consistencies. This allows you to streamline operations and standardize workflows, ultimately to reduce manual labor and human error across your business. Platforms such as customer relationship management (CRM) can simply be integrated into the ERP system.

ERP software offers many advantages, but standardization is one of the most important. By relying on standardization and configuration, organizations could also see reduced project costs and better cross-team collaboration with less friction.   

Disadvantages of ERP

Increase complexity

ERP is an all-encompassing business management tool, and it can be quite complex. The software can be exciting. Organizations can get caught up in that excitement and risk failing to make a well-thought-out plan for ERP implementation.

Some organizations may find the ERP solution too large and not well-suited to their needs. This can result in a poor ROI and should be avoided if possible. The best ways to avoid these pitfalls are to build role-based user training and to simplify your ERP software to fit your organization’s needs.

Add short-term costs

There are multiple factors to consider when thinking about switching to ERP software. One of them is cost: not only the cost of the software, but also the cost of the time and resources needed to implement the system and train employees across all departments.

Another aspect of cost is the ongoing operational costs required of an ERP solution, specifically an on-premises ERP solution. The best way to avoid this ongoing cost is to utilize a cloud-based ERP system, which is a Software-as-a-Service (SaaS) solution that can be run from any location.

One other factor to consider is the change management that is required when implementing an ERP system. ERP implementation requires changes to business processes and workflows. These changes are major investments in time and resources. When selecting ERP software, consider these factors and select the system type that best fits your organization’s needs.

More time-consuming

Since ERP is customizable, and not one-size-fits-all software, implementation can become very time-consuming. Customization is a huge advantage of the ERP solution, but it can be a challenge, as the configuration needs to be built from the ground up.

An implementation process takes time, and organizations must prepare for a lengthy one. The time it takes to move to an ERP system depends on which legacy system is being used. The best way to avoid this issue is, again, to have an ERP implementation plan in place that is clear and concise and includes an assigned implementation team.

IBM and ERP

The migration from a legacy system to ERP software can be a huge undertaking no matter the size of the organization. When considering an ERP solution, it’s important to bring in experts to help run a smooth and transparent implementation plan.

IBM Consulting® experts can help your organization successfully migrate legacy ERP applications to the cloud, redesign processes to leverage data, AI and automation, and transform finance into a competitive advantage within your business.

SAP managed services for applications and ERP can help manage an organization’s workloads, giving you more time to focus on innovation and new opportunities. Managed services for SAP applications enable agility and resource optimization by supporting and optimizing underlying operational functions. Areas like security and compliance reporting, application management, and service delivery to lines-of-business become more predictable from a pricing, resource and workload perspective.

Explore SAP consulting services

Your Black Friday observability checklist

Black Friday—and really, the entire Cyber Week—is a time when you want your applications running at peak performance without completely exhausting your operations teams.

Observability solutions can help you achieve this goal, whether you’re a small team with a single product or a large team operating complex ecommerce applications. But not all observability solutions (or tools) are alike, and if you are missing just one key capability, it could cause customer satisfaction issues, slower sales and even top- and bottom-line revenue catastrophes.

The observability market is full of vendors, with different descriptions, features and support capabilities. This can make it difficult to distinguish what’s critical from what is just nice to have in your observability solution.

Here’s a handy checklist to help you find and implement the best possible observability platform to keep all your applications running merry and bright:

  • Complete automation. You need automatic capture to achieve a comprehensive real-time view of your application. A full-stack tool that can automatically observe your environment will minimize mean time to detection (MTTD) and prevent potential outages.
  • High-fidelity data. The most powerful use of data is the ability to contextualize. Without context, your team has no idea how big or small your problem is. Contextualizing telemetry data by visualizing the relevant information or metadata enables teams to better understand and interpret the data. This combination of accuracy and context helps teams make more informed decisions and pinpoint the root causes of issues.
  • Real-time change detection. Monitoring your entire stack with a single platform (from mainframes to mobile) can contribute to your growth. How? You can now see how transactions zip across the internet, keeping the wheels of your commerce well lubricated. Another advantage of real-time detection is the visibility you gain when you connect your application components with your underlying infrastructure. This is important to your IT team’s success, as they now have visibility into your stack and services and can map them to their dependencies.
  • Mobile and website digital experience management. End-user, mobile, website and synthetic monitoring all enable you to improve the end-user experience. You should use an observability tool with real-user monitoring to deliver an exceptional experience for users and accommodate growth. Real-user monitoring tracks real users’ interactions with your applications, while end-user monitoring captures performance data from the user’s perspective. Synthetic monitoring creates simulated user interactions to proactively identify potential issues, ensuring your applications meet user expectations and performance standards. Combined, these capabilities can provide real-time insights into server performance and website load times; capture user interactions and provide detailed insights into user behavior; and monitor server loads and traffic distribution, automatically adjusting load-balancing configurations to distribute traffic evenly, preventing server overloads and ensuring a smooth shopping experience.
  • Built-in AI and machine learning. Having AI-assisted root cause analysis in your observability platform is crucial if you want to diagnose the root causes of issues or anomalies within a system or application automatically. This capability is particularly valuable in complex and dynamic environments where manual analysis might be time consuming and less efficient.
  • Visibility deep and wide. The true advantage of full stack lies in connecting your application components with the underlying infrastructure. This is critical for IT success because it grants visibility into your stack and services and maps them to dependencies.
  • Ease of use. An automated and user-friendly installation procedure minimizes the complexity of deployment.
  • Broad platform support. Your solution should monitor popular cloud platforms (AWS, GCP, Microsoft Azure, IBM Cloud®) for both Infrastructure as a Service and Platform as a Service, with simplified installation.
  • Continuous production profiling. Profiles code issues as they occur across various programming languages, offering visibility into code-level performance hot spots and bottlenecks.

In a market with detection gaps, 10 seconds is too long. Let this checklist guide you as you build a real-time full-stack observability solution that keeps your business running smoothly for the entire holiday season.

Request a demo to learn more

Level up your Kafka applications with schemas

Apache Kafka is a well-known open-source event store and stream processing platform that has grown to become the de facto standard for data streaming. In this article, developer Michael Burgess provides insight into the concept of schemas and schema management as a way to add value to your event-driven applications on the fully managed Kafka service, IBM Event Streams on IBM Cloud®.

What is a schema?

A schema describes the structure of data.

For example:

A simple Java class modelling an order of some product from an online store might start with fields like:

public class Order {
    private String productName;
    private String productCode;
    private int quantity;
    // […]
}

If order objects were being created using this class, and sent to a topic in Kafka, we could describe the structure of those records using a schema such as this Avro schema:

{
  "type": "record",
  "name": "Order",
  "fields": [
    {"name": "productName", "type": "string"},
    {"name": "productCode", "type": "string"},
    {"name": "quantity", "type": "int"}
  ]
}

Why should you use a schema?

Apache Kafka transfers data without validating the information in the messages. It has no visibility into what kind of data is being sent and received or what data types the messages might contain; Kafka does not examine the metadata of your messages.

One of the functions of Kafka is to decouple consuming and producing applications, so that they communicate via a Kafka topic rather than directly. This allows them to each work at their own speed, but they still need to agree on the same data structure; otherwise, the consuming applications have no way to deserialize the data they receive back into something meaningful. The applications all need to share the same assumptions about the structure of the data.

In the scope of Kafka, a schema describes the structure of the data in a message. It defines the fields that need to be present in each message and the types of each field.

This means a schema forms a well-defined contract between a producing application and a consuming application, allowing consuming applications to parse and interpret the data in the messages they receive correctly.
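
To make this contract concrete, here is a minimal sketch using the Apache Avro Java library: the producer side serializes an order record with the schema, and only a consumer that shares the same schema can turn those bytes back into a meaningful record. Kafka wiring (topics, producers, consumers) is omitted for brevity.

import java.io.ByteArrayOutputStream;
import org.apache.avro.Schema;
import org.apache.avro.generic.*;
import org.apache.avro.io.*;

public class SchemaContractDemo {
    static final String ORDER_SCHEMA = """
        {"type":"record","name":"Order","fields":[
          {"name":"productName","type":"string"},
          {"name":"productCode","type":"string"},
          {"name":"quantity","type":"int"}]}""";

    public static void main(String[] args) throws Exception {
        Schema schema = new Schema.Parser().parse(ORDER_SCHEMA);

        // Producer side: build a record and serialize it with the schema.
        GenericRecord order = new GenericData.Record(schema);
        order.put("productName", "Widget");
        order.put("productCode", "W-42");
        order.put("quantity", 3);
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        BinaryEncoder encoder = EncoderFactory.get().binaryEncoder(out, null);
        new GenericDatumWriter<GenericRecord>(schema).write(order, encoder);
        encoder.flush();

        // Consumer side: only an application that shares the schema can
        // deserialize the bytes back into a meaningful record.
        BinaryDecoder decoder = DecoderFactory.get().binaryDecoder(out.toByteArray(), null);
        GenericRecord received = new GenericDatumReader<GenericRecord>(schema).read(null, decoder);
        System.out.println(received.get("productName") + " x" + received.get("quantity"));
    }
}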

What is a schema registry?

A schema registry supports your Kafka cluster by providing a repository for managing and validating schemas within that cluster. It acts as a database for storing your schemas and provides an interface for managing the schema lifecycle and retrieving schemas. A schema registry also validates evolution of schemas.

Optimize your Kafka environment by using a schema registry.

A schema registry is essentially an agreement on the structure of your data within your Kafka environment. By having a consistent store of the data formats in your applications, you avoid common mistakes that can occur when building applications, such as poor data quality and inconsistencies between your producing and consuming applications that may eventually lead to data corruption. Having a well-managed schema registry is not just a technical necessity; it also contributes to the strategic goal of treating data as a valuable product and helps tremendously on your data-as-a-product journey.

Using a schema registry increases the quality of your data and ensures data remains consistent by enforcing rules for schema evolution. As well as ensuring data consistency between produced and consumed messages, a schema registry ensures that your messages remain compatible as schema versions change over time. Over the lifetime of a business, it is very likely that the format of the messages exchanged by the applications supporting the business will need to change. For example, the Order class in the example schema above might gain a new status field, or the product code field might be replaced by a combination of department number and product number, or similar changes. The result is that the schema of the objects in our business domain continually evolves, so you need to be able to ensure agreement on the schema of messages in any particular topic at any given time.

There are various patterns for schema evolution:

  • Forward Compatibility: where the producing applications can be updated to a new version of the schema, and all consuming applications will be able to continue to consume messages while waiting to be migrated to the new version.
  • Backward Compatibility: where consuming applications can be migrated to a new version of the schema first, and are able to continue to consume messages produced in the old format while producing applications are migrated.
  • Full Compatibility: when schemas are both forward and backward compatible.

A schema registry is able to enforce rules for schema evolution, allowing you to guarantee either forward, backward or full compatibility of new schema versions, preventing incompatible schema versions being introduced.
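
As an illustration of the kind of check a registry performs, the following sketch uses the Apache Avro Java library (independently of any particular registry) to verify that an evolved schema, which adds a status field with a default value, can still read records written with the original schema, making the change backward compatible:

import org.apache.avro.Schema;
import org.apache.avro.SchemaCompatibility;

public class CompatibilityCheck {
    public static void main(String[] args) {
        Schema v1 = new Schema.Parser().parse("""
            {"type":"record","name":"Order","fields":[
              {"name":"productName","type":"string"},
              {"name":"productCode","type":"string"},
              {"name":"quantity","type":"int"}]}""");

        // v2 adds a "status" field with a default, so consumers reading with
        // v2 can still consume v1 messages: a backward-compatible change.
        Schema v2 = new Schema.Parser().parse("""
            {"type":"record","name":"Order","fields":[
              {"name":"productName","type":"string"},
              {"name":"productCode","type":"string"},
              {"name":"quantity","type":"int"},
              {"name":"status","type":"string","default":"CREATED"}]}""");

        SchemaCompatibility.SchemaPairCompatibility result =
            SchemaCompatibility.checkReaderWriterCompatibility(v2, v1);
        System.out.println(result.getType()); // COMPATIBLE
    }
}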

By providing a repository of versions of schemas used within a Kafka cluster, past and present, a schema registry simplifies adherence to data governance and data quality policies, since it provides a convenient way to track and audit changes to your topic data formats.

What’s next?

In summary, a schema registry plays a crucial role in managing schema evolution, versioning and the consistency of data in distributed systems, ultimately supporting interoperability between different components. Event Streams on IBM Cloud provides a Schema Registry as part of its Enterprise plan. Ensure your environment is optimized by utilizing this feature on the fully managed Kafka offering on IBM Cloud to build intelligent and responsive applications that react to events in real time.

  • Provision an instance of Event Streams on IBM Cloud here.
  • Learn how to use the Event Streams Schema Registry here.
  • Learn more about Kafka and its use cases here.
  • For any challenges in setup, see our Getting Started Guide and FAQs.


Integrating healthcare apps and data with FHIR + HL7

Today’s healthcare providers use a wide variety of applications and data across a broad ecosystem of partners to manage their daily workflows. Integrating these applications and data is critical to their success, allowing them to deliver patient care efficiently and effectively.

Despite modern data transformation and integration capabilities that make for faster and easier data exchange between applications, the healthcare industry has lagged behind because of the sensitivity and complexity of the data involved. In fact, some healthcare data is still transmitted in physical formats, impeding providers’ ability to benefit from integration and automation.

What is HL7?

Health Level Seven (HL7) is a range of international standards designed to address this challenge. First introduced in 1989, the standards were created by Health Level Seven International, a group of technology and healthcare leaders with the goal of providing better hospital workflow support. HL7 has provided a standard set of patient attributes and clinical events to improve interoperability in healthcare.

What is the FHIR Standard?

Fast Healthcare Interoperability Resources (FHIR) is the most recent version of HL7.

The FHIR specification defines standards for healthcare data exchange, including how healthcare information can be shared between different computer systems regardless of the way it is stored. The FHIR standard describes data elements, messaging and document formats, as well as an application programming interface (API) for exchanging electronic health records (EHRs) and electronic medical records (EMRs). FHIR is open source, providing open APIs that enable continuous real-time data exchange.

What are the benefits of FHIR?

FHIR makes it simple for patients to manage their care, even if they see multiple providers in different healthcare organizations and use multiple plans (multiple payers using multiple EHRs). By creating a unified, single personal patient health record that integrates data from different formats, FHIR standards deliver a complete view of patient information to improve overall care coordination and clinical decision support. Everyone benefits from more effective, personalized, integrated and cost-efficient healthcare solutions.

What are the differences between FHIR and HL7?

FHIR draws on previous standards such as HL7 Version 2 (V2) and HL7 Version 3 (V3) and uses common web standards such as RESTful APIs, XML, JSON and HTTP. Using REST APIs makes FHIR more efficient as it allows data consumers to request information on demand, rather than subscribing to a feed that shares all data whether or not it is needed immediately (as was the case in earlier versions of HL7).

The HL7 FHIR REST API can be used with mobile apps, cloud-based communications, EHR-based data sharing, real-time server communication and more. Using FHIR, software developers can develop standardized browser-based healthcare applications that allow users to access clinical data from any health care system regardless of the operating systems and devices used.
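
For illustration, here is a minimal sketch of a FHIR REST read using Java’s built-in HTTP client. The server base URL and patient ID are placeholders, and a real deployment would add authentication and error handling:

import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class FhirReadExample {
    public static void main(String[] args) throws Exception {
        // Hypothetical FHIR server base URL and resource ID.
        String base = "https://fhir.example.org/api";
        String patientId = "12345";

        HttpClient client = HttpClient.newHttpClient();
        HttpRequest request = HttpRequest.newBuilder()
            .uri(URI.create(base + "/Patient/" + patientId))
            // Standard FHIR media type for JSON representations.
            .header("Accept", "application/fhir+json")
            .GET()
            .build();

        HttpResponse<String> response =
            client.send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println(response.statusCode());
        System.out.println(response.body()); // a Patient resource as JSON
    }
}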

FHIR is easier to learn and implement than earlier versions and provides out-of-the-box interoperability. The FHIR standard also allows for different architectural approaches that can be used to gather information from a modern or legacy system.

Is FHIR compatible with HL7?

While FHIR is compatible with HL7 V2 and CDA standards, organizations should migrate to FHIR to take advantage of the new direction for health information data exchange.  However, many providers still rely on prior versions of the HL7 standard, leaving some IT teams unsure if they should rewrite existing applications for HL7 V2 or replace them.

IBM® Integration and FHIR

Our application integration solution, IBM App Connect, has the power to transform HL7 to FHIR bidirectionally without the need to rewrite existing applications. It can move healthcare data from system to system, including to an EHR acting as a FHIR server.

IBM App Connect for Healthcare is a specialized version of IBM App Connect for the healthcare industry. It offers pre-built patterns that provide smart FHIR transformation and routing. The patterns can convert FHIR into any other format, which creates opportunities for healthcare organizations to realize the benefits of FHIR and explore the latest integration methods, including event-driven architectures. Health IT providers can use IBM API Connect to extend the reach of these FHIR resources for multiple use cases with the ability to create, manage, secure and socialize FHIR APIs.

Learn more about IBM’s FHIR and HL7 implementation

Visit the IBM App Connect product page

Creating a sustainable future with the experts of today and tomorrow

When extreme weather strikes, it hits vulnerable populations the hardest. In the current global climate of stronger and more frequent storms, heat waves, droughts and floods, how do we build more positive environmental and social impact? We have a responsibility to apply our technological expertise, resources and ecosystem to help the world become more resilient to these environmental challenges.

We need a three-pronged approach to long-term sustainability: preparing the workforce with skills for a greener future; forging strategic cross-sector partnerships; and empowering purpose-driven individuals and organizations with the right tools and technology to accelerate action.

Equipping the current and future workforce with green skills

According to new Morning Consult research commissioned by IBM, 71% of business leaders surveyed anticipate their business will emphasize sustainability skills criteria in their hiring in the next two years, with 92% expecting to invest in sustainability training in the next year. There is already a skills gap in technology and sustainability, and these results show that it continues to grow.

But when it comes to training and credentials in green and technology skills, there just aren’t that many options. IBM already has a strong track record of providing free skilling resources to communities that are underrepresented in tech, most recently with a commitment to skill 2 million learners in AI. So, to help prepare the experts of tomorrow with the green and technology skills they need, we are providing free training on IBM SkillsBuild.

Our initial curriculum offerings will include three courses: Sustainability and Technology Fundamentals, Data Analytics for Sustainability and Enterprise Thinking for Sustainability. Through these foundational courses, learners will explore topics like ecology, biodiversity and social impact to help them develop a comprehensive understanding of sustainability. 

Lessons will include real-life case studies and opportunities to learn about how AI can assist businesses in achieving sustainability goals and mitigating climate risks. The courses also provide instruction in data analytics contextualized around sustainability use cases. We will also add more advanced courses that take a deeper look at how data analysis and visualization skills can be applied to practical sustainability use cases, such as examining energy consumption in a community. 

These courses are available to high school students, university students and faculty, and adult learners worldwide. Learners are free to take as many courses as they want and to study at their own pace. Upon successful completion of some of these courses, learners receive a credential that is recognized by employers.

IBM SkillsBuild has a global reach, and it has already benefited many learners with the inspiration and resources they need to pursue careers in technology. For instance, in Nigeria, Clinton Chidubem Amam found employment as a graphics designer after completing IBM SkillsBuild courses, and his work was displayed at the World Economic Forum in Davos earlier this year. Meanwhile, Oscar Ramirez, who arrived in the US as a child from Mexico, was able to investigate everything from AI to cybersecurity and project management while finishing his studies in Applied Mathematics and Computational Mathematics at San Jose State University.

Uniting sustainability experts in strategic partnerships

Whether it’s closing the green skills gap or tackling environmental challenges, you can’t go at it alone. Addressing big challenges requires collaboration and strategic partnership with experts that intimately understand the nuances of different domains.

That’s why IBM’s award-winning pro-bono social impact program, the IBM Sustainability Accelerator, selects innovative organizations focused on solutions worth scaling. In this program, diverse cross-sector experts in topics such as sustainable agriculture and renewable energy come together from both inside and outside IBM. Using a human-centered approach along with IBM Garage, artificial intelligence, advances in data, cloud and other technologies, these teams collaborate on projects to help vulnerable populations become more resilient to climate change.

Five organizations are now joining this initiative on the path toward clean water and sanitation for all (UN SDG6):

  • The University of Sharjah will build a model and application to monitor and forecast water access conditions in the Middle East and North Africa to support communities in arid and semi-arid regions with limited renewable internal freshwater resources.
  • The University of Chicago Trust in Delhi will aggregate water quality information in India, build and deploy tools designed to democratize access to water quality information, and help improve water resource management for key government and nonprofit organizations.
  • The University of Illinois will develop an AI geospatial foundation model to improve rainfall prediction and flood forecasting in mountain headwaters across the Appalachian Mountains in the US.
  • Instituto IGUÁ will create a cloud-based platform for sanitation infrastructure planning in Brazil alongside local utility providers and governments.
  • Water Corporation will design a self-administered water quality testing system for Aboriginal communities in Western Australia.

We’re excited to partner with organizations that deeply understand the water and sanitation challenges that communities face. IBM has committed to support our IBM Sustainability Accelerator projects, including our sustainable agriculture and clean energy cohorts, with USD 30 million worth of services by 2025.

Supporting a just transition for all

To build a more sustainable world, we must empower communities with the skills, tools and support they need to adapt to environmental hazards with resilience. By providing access to IBM technology and know-how, we can empower the communities most vulnerable to the effects of extreme weather and climate change. And by democratizing access to sustainability education through IBM SkillsBuild, we help the next generation of experts realize their passion for applying advanced technology to preserve and protect the environment. These efforts, along with our strategic partnerships, will lead us all into a more sustainable future.

Learn how you can collaborate with us to create a sustainable future.

Watsonx: a game changer for embedding generative AI into commercial solutions

IBM watsonx is changing the game for enterprises of all shapes and sizes, making it easy for them to embed generative AI into their operations. This week, the CEO of WellnessWits, an IBM Business Partner, announced that the company has embedded watsonx in its app to help patients ask questions about chronic disease and more easily schedule appointments with physicians.

Watsonx comprises three components that empower businesses to customize their AI solutions: watsonx.ai offers intuitive tooling for powerful foundation models; watsonx.data enables compute-efficient, scalable workloads wherever data resides; and watsonx.governance provides the guardrails essential to responsible implementation. Watsonx gives organizations the ability to refine foundation models with their own domain-specific data to gain a competitive advantage, and to ground model outputs in external sources of knowledge.

These features—along with a broad range of traditional machine learning and AI functions—are now available to independent software vendors (ISVs) and managed service providers (MSPs) as part of IBM’s embeddable software portfolio, supported by the IBM Ecosystem Engineering Build Lab and partner ecosystem.

The watsonx platform, along with other IBM AI applications, libraries and APIs, helps partners bring AI-powered commercial software to market more quickly, reducing the need for specialized talent and developer resources.

A platform optimized for enterprise AI

IBM is focused on helping organizations create business value by embedding generative AI. Watsonx provides the functionality enterprise developers need most, including summarization of domain-specific text; classification of inputs based on sentiment analysis, threat levels or customer segmentation; text content generation; analysis and extraction (or redaction) of essential information; and question-answering functions. The most common use cases from partners often combine several of these AI tasks.
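All of these functions surface through watsonx.ai’s text generation API, so one integration pattern covers the lot. As a minimal sketch, the Python call below summarizes a piece of domain text over REST; it assumes an IBM Cloud IAM bearer token and placeholder model and project IDs, and the API version date and field names should be checked against the current watsonx.ai documentation:

```python
# Minimal sketch: one watsonx.ai text-generation call; summarization,
# classification, extraction and Q&A differ only in the prompt text.
# Assumes a valid IBM Cloud IAM bearer token; IDs are placeholders.
import requests

IAM_TOKEN = "YOUR_IAM_BEARER_TOKEN"  # from the IBM Cloud IAM token service

response = requests.post(
    "https://us-south.ml.cloud.ibm.com/ml/v1/text/generation",
    params={"version": "2023-05-29"},          # API version date may change
    headers={"Authorization": f"Bearer {IAM_TOKEN}"},
    json={
        "model_id": "ibm/granite-13b-instruct-v2",
        "project_id": "YOUR_PROJECT_ID",
        "input": "Summarize this claim note in one sentence: ...",
        "parameters": {"max_new_tokens": 100},
    },
    timeout=60,
)
response.raise_for_status()
print(response.json()["results"][0]["generated_text"])
```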

ISVs need the flexibility to choose models appropriate to their industry, domain and use case. Watsonx provides access to open-source models (through the Hugging Face catalog), third-party models (such as Meta’s Llama 2) and IBM’s own Granite models. IBM provides an IP indemnity (contractual protection) for its foundation models, enabling partners to be more confident AI creators. With watsonx, ISVs can further differentiate their offering and gain competitive advantage by harnessing proprietary data and tuning the models to domain-specific tasks. These capabilities allow ISVs to better address their clients’ industry-specific needs.
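To make that flexibility concrete, here is a hedged sketch using the ibm-watsonx-ai Python SDK: the same inference code targets an IBM Granite model or Meta’s Llama 2 simply by swapping the model ID (the IDs shown are illustrative, and availability varies by region and date):

```python
# Sketch: swapping between IBM Granite and a third-party model (Llama 2)
# is a one-string change; model IDs and credentials are placeholders.
from ibm_watsonx_ai import Credentials
from ibm_watsonx_ai.foundation_models import ModelInference

creds = Credentials(url="https://us-south.ml.cloud.ibm.com",
                    api_key="YOUR_IBM_CLOUD_API_KEY")

for model_id in ("ibm/granite-13b-instruct-v2", "meta-llama/llama-2-70b-chat"):
    model = ModelInference(model_id=model_id, credentials=creds,
                           project_id="YOUR_PROJECT_ID")
    answer = model.generate_text(
        prompt="In one line, what does an RFP request from a supplier?"
    )
    print(f"{model_id}: {answer}")
```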

Let’s explore a few AI use cases that span different industries. 

Exceptional customer care through AI solutions

Today, customers expect seamless experiences and fast answers to their questions, and companies that fail to meet these expectations risk falling behind. Customer service has leapfrogged other functions to become CEOs’ top generative AI priority. Given this trend, companies should be looking for ways to embed generative AI into their customer care portals. To accelerate this process, companies can implement AI-infused customer care commercial solutions. IBM’s embeddable AI technology, such as IBM watsonx Assistant and watsonx.ai, allows ISVs to quickly and easily build AI into their solutions, which in turn helps them to reduce time to market and reach their customers sooner.

Watsonx allows enterprises to effortlessly generate transcripts of conversations with live agents or automate Q&A sessions. With watsonx.ai, they can obtain concise conversation summaries, extract key information and classify interactions, such as conducting sentiment analysis to gauge customer satisfaction. These insights then refine and improve the context available to live agents.
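As a rough sketch of that flow, the snippet below (again assuming the ibm-watsonx-ai SDK, with placeholder credentials and model ID) turns a finished chat transcript into a summary and a satisfaction label that could be attached to the ticket for the next agent:

```python
# Sketch: post-call processing of a support transcript with watsonx.ai.
# SDK usage is assumed; credentials and model ID are placeholders.
from ibm_watsonx_ai import Credentials
from ibm_watsonx_ai.foundation_models import ModelInference

model = ModelInference(
    model_id="ibm/granite-13b-instruct-v2",
    credentials=Credentials(url="https://us-south.ml.cloud.ibm.com",
                            api_key="YOUR_IBM_CLOUD_API_KEY"),
    project_id="YOUR_PROJECT_ID",
)

transcript = """Agent: How can I help you today?
Customer: My invoice shows two charges for the same subscription.
Agent: I see the duplicate and have refunded the second charge.
Customer: Great, thanks for the quick fix!"""

summary = model.generate_text(
    prompt=f"Summarize this support conversation in two sentences:\n{transcript}")
sentiment = model.generate_text(
    prompt=f"Label the customer's closing sentiment as satisfied or unsatisfied:\n{transcript}")

ticket_note = {"summary": summary.strip(), "sentiment": sentiment.strip()}
print(ticket_note)  # e.g., stored on the ticket for the next live agent
```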

Streamline your procurement process using watsonx

By embedding AI technology in enterprise solutions, organizational leaders can connect disparate, broken processes and data into integrated end-to-end solutions.

For example, supply chain management can be a challenge for companies. The process of changing suppliers can be a time-consuming and complex task, as it requires intensive research and collaboration across the organization. Instead of spending cycles and resources on creating an in-house solution that streamlines this process, companies can implement an AI-infused supply chain management solution developed by ISVs. ISVs are experts in their domain and build their solutions with enterprise-grade AI, such as watsonx Assistant, watsonx.ai and watsonx.data, so companies can feel confident in their selection.

Watsonx Assistant can serve as a user-friendly, natural-language Q&A interface for your supplier database. In the background, watsonx.ai generates database queries and content like Requests for Proposals (RFPs) or Requests for Information (RFIs), while Watson Discovery analyzes supplier financial reports. Watsonx.data acts as a front end for the company’s ERP system, with up-to-date attributes about inventory items, ratings of suppliers, quantities available and so on, along with a third-party data warehouse providing further decision criteria. Thus, teams can work smarter and move toward better, more integrated business outcomes. 
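The query-generation step in that scenario reduces to a text-to-SQL prompt. Here is a hedged sketch, with an invented supplier schema and question for illustration; generated SQL should always be validated before it touches the ERP or lakehouse:

```python
# Sketch: natural-language question -> SQL over a hypothetical supplier
# table via watsonx.ai text generation. Schema and question are invented.
from ibm_watsonx_ai import Credentials
from ibm_watsonx_ai.foundation_models import ModelInference

model = ModelInference(
    model_id="ibm/granite-13b-instruct-v2",
    credentials=Credentials(url="https://us-south.ml.cloud.ibm.com",
                            api_key="YOUR_IBM_CLOUD_API_KEY"),
    project_id="YOUR_PROJECT_ID",
)

schema = ("suppliers(id, name, region, rating, lead_time_days, "
          "item_sku, qty_available)")
question = ("Which suppliers in Europe can ship SKU 8817 within two weeks, "
            "best rated first?")

sql = model.generate_text(prompt=(
    f"Given the table {schema},\n"
    f"write one SQL query that answers: {question}\n"
    "Return only the SQL."
))
print(sql)  # review and validate before executing against production systems
```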

Watch the demo of these use cases, or explore interactive demos in the IBM Digital Self-Serve Co-Create Experience.

Partner success stories

WellnessWits is using watsonx Assistant to create a virtual care solution that connects patients to chronic disease specialists from anywhere. The platform features AI-powered chat functionality that helps patients gather information and answers about their chronic disease, and facilitates personalized, high-quality care from physicians who specialize in their condition.

Ubotica is leveraging IBM Cloud and watsonx.ai in its CogniSAT platform, enabling developers to deploy AI models to satellites for a wide variety of observational use cases, such as detecting forest fires or space junk. CogniSAT improves the efficiency with which data is stored and processed, providing edge-based analysis onboard satellites.

IBM solution provider Krista Software helped its client Zimperium build a mobile-first security platform using embedded AI solutions. The platform accelerates mobile threat defense response by automating ticket creation, routing and software deployment, reducing a 4-hour process to minutes.

Benefits of building with IBM

ISVs who partner with IBM get more than just functionality. Our team will help you create a solution architecture for embedding our AI technology and explore how to monetize your solution set; we also provide technical resources and can even help sell your solution through our seller network.

IBM Partner Plus, our partner program, provides business partners with a plethora of resources and benefits to help them embed technology. We find the following resonate especially well with partners looking to start their journey of building with IBM: the IBM Digital Self-Serve Co-Create Experience (DSCE), the IBM Ecosystem Engineering Build Lab and the IBM Sales Partner Advocacy Program.

DSCE helps data scientists, application developers and MLOps engineers discover and try IBM’s embeddable AI portfolio across watsonx, IBM Watson libraries, IBM Watson APIs and IBM AI applications. The IBM Ecosystem Engineering Build Lab provides partners with technical resources, experts and support to accelerate co-creation of their solutions with embedded IBM technology. The IBM Sales Partner Advocacy Program is a co-sell benefit that encourages collaboration with IBM sales teams when partners sell solutions with embedded IBM technology to IBM clients.

Explore how your company can partner with IBM to build AI-powered commercial solutions today.

Explore AI-powered commercial solutions with IBM.

IBM named a Leader in The Forrester Wave™: Digital Process Automation Software, Q4 2023

Forrester Research just released “The Forrester Wave™: Digital Process Automation Software, Q4 2023: The 15 Providers That Matter Most And How They Stack Up” by Craig Le Clair with Glenn O’Donnell, Renee Taylor-Huot, Lok Sze Sung, Audrey Lynch and Kara Hartig, and IBM is proud to be recognized as a Leader.

IBM named a Leader

In the report, Forrester Research evaluated 15 digital process automation (DPA) providers against 26 criteria in three categories: Current offering, Strategy and Market presence.

IBM received the highest scores among all vendors in the Market presence category. In the Current offering category, IBM received the highest scores in the AI-led process transformation tools and tooling for process automation criteria, and among the highest scores in the ability to meet and govern use cases criterion. In addition, IBM received the highest possible scores in the vision, innovation and partner ecosystem criteria in the Strategy category.

You can download a complimentary copy of the full Forrester Wave™ report to learn more about IBM and other vendors’ offerings. 

Intelligent automation and deep expertise with IBM

IBM has embraced the convergence of AI and business automation, focusing on providing an AI-first framework of intelligent automation in our offerings. Intelligent automation allows customers to leverage digital scale to improve business operations, provide better customer experiences and free employees to do higher-level work.

The Forrester report recognizes IBM Cloud Pak for Business Automation when it comes to AI asset maturity. The report states, “IBM brings together AI assets with automation smarts for deep deployments.” In addition, Forrester says that “IBM has one of the stronger DPA governance solutions in the field.”

The Forrester report also acknowledges IBM’s experience, stating, “Look to IBM for sophisticated use cases that require a wide breadth of DPA functionality and deep industry expertise.”

About IBM Cloud Pak for Business Automation

IBM Cloud Pak for Business Automation is a modular set of integrated software components that automates work and accelerates business growth. With this solution, customers can transform fragmented workflows and achieve 97% straight-through processing, helping them stay competitive, boost efficiency and reduce operational costs.

With IBM Cloud Pak for Business Automation, organizations can simplify complex workflows, build low-code and no-code automations with the help of AI and gain deployment flexibility.

Learn more with IBM

Learn more about how IBM’s intelligent business automation offerings can propel your organization into 2024.

Download your free copy of the Forrester Wave™ report
Learn more about IBM Cloud Pak for Business Automation
Get a 30-day free trial of IBM Cloud Pak for Business Automation