5 True and False Mainframe Facts Your Business Should Know

Andrei Leontiev
Director, Large-Scale Business Technologies Division

IBA Group

When you think of a mainframe, what image comes to mind? Large, on-premise? Racks and rows of servers housed across several data centre floors? The truth is, mainframes have evolved. In this article we dispel 5 mainframe inaccuracies and reveal some truths about the nature – and future – of mainframes.

False: Mainframes are best suited for onsite computing environments

True: Mainframes can enable scalable, hybrid cloud architectures

Mainframes were traditionally seen as on-premise computers used primarily by large enterprises for critical applications and transaction processing.

Today, more than 71 percent of Fortune 500 companies still use mainframes. They also account for 68 percent of global IT workloads. Many of these organisations have deployed hybrid applications that include on-premise, private cloud, third-party public cloud and mainframes.

The truth is, the massive computing power, storage and scalability features of mainframes make them ideal central processing units for hybrid cloud architectures. Organisations are realising the benefits of greater flexibility by enabling workloads to move between private/public clouds and mainframes as their computing needs and costs change.

Another advantage mainframes offer in a hybrid environment is simplicity. Complex, multi-cloud environments that span a variety of on-premise and cloud resources run a high risk of failure. But mainframes, such as the IBM Z series, deliver broad functionality – simply, efficiently and inexpensively. Mainframes are, therefore, more relevant in a hybrid world and can become the central point of complex, multi-cloud and on-premise solutions.

False: Only on-premise mainframes offer the best security

True: Cloud-enabled mainframes are now more secure than ever

As the world becomes more digitised and interconnected, mainframes remain one of the most stable, secure and mature environments to support IT initiatives. In fact, IBM states that 87 percent of all credit card transactions and nearly US$8 trillion in payments a year are processed on mainframes. Next-generation technologies, such as blockchain, already utilise mainframes for compute-intensive tasks and data storage.

But are cloud-enabled mainframes as secure as their on-premise counterparts?

It would seem so, with big tech vendors such as IBM releasing secure, cloud-ready mainframes. These mainframes include robust security with pervasive encryption, powerful analytics and machine learning. This is critical in a world of increasing cybersecurity concerns, viruses and malware.

Today’s cloud-ready mainframes are capable of processing billions of fully encrypted transactions a day. And they are designed to deliver the same industry-leading security whether deployed on-premise or within public cloud data centres.

As a result, CIOs and tech specialists can provide cloud-ready mainframe environments to their users without compromising trust, while also meeting industry compliance regulations.

False: Mainframes function only with onsite IT and support staff

True: Virtual support teams can provide mainframe as-a-service

Mainframe talent constraints, combined with the cost of replacing tech staff outright, are forcing CIOs to make a choice: manage existing mainframes with available on-site IT staff, or consider mainframe as-a-service solutions. The days of managing mainframes with a centralised IT department are dwindling, as is locally-sourced talent.

Mainframe as-a-service (MaaS), on the other hand, enables organisations to tap into mainframe vendors that provide all the necessary IT infrastructure and support. These service providers normally cover all the costs of maintenance and upgrades which translates into significant cost- and risk-avoidance. In fact, most MaaS end-users only pay for the compute, storage and batch time consumed in normal operations. And MaaS enables them to scale up or down according to their changing requirements.

The result: CIOs can scrap the nerve-jangling task of maintaining aging mainframes.

MaaS also offers assurances of business continuity. Most service providers offer extensive disaster recovery and redundancy, including mirrored sites, which many organisations find cost-prohibitive.

SLAs enable MaaS vendors to provide very accurate predictions of the cost of deployment, configuration, training, managed services and preventive maintenance for mainframes. In addition, most MaaS vendors offer 24/7/365 support.

The advantages of virtual teams hired to manage mainframes include the ability to tap into geographically dispersed, highly qualified talent. On-site IT staff can augment their mainframe know-how with knowledge transfers from experienced foreign colleagues. Organisations can then delegate certain functions to virtual teams while maintaining business-critical or sensitive activities on-site.

False: Mainframes require regular, manual updates and maintenance

True: Apps can provide automated, real-time mainframe support

Traditional mainframes required regular, consistent on-site monitoring and manual, batch updates. The task of keeping them well serviced and running smoothly became onerous, resulting in long incident management chains and service downtime.

But today’s mainframes can be supported remotely and automatically with business applications on mainframe servers. These apps can digitize, accumulate and share the experience of support teams.

Support teams can also work remotely to manage and maintain mainframe systems and ensure business continuity. This includes monitoring distributed teams and apps from a single entry point. The result is a clear picture of the state of business applications, staff actions, and mainframe software and hardware.

Embedded AI modules can spot operational support problems in real-time and generate and suggest solutions. Tickets can be automatically assigned for relevant actions on problems.
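As an illustration only, a rule-based monitor of this kind can be sketched in a few lines. Everything here – the metric names, thresholds, and team queues – is hypothetical, not a description of any vendor's actual product:

```python
# Illustrative thresholds and assignment queues (hypothetical values).
RULES = {
    "cpu_pct":     (90.0, "capacity-team"),
    "queue_depth": (1000, "mq-team"),
    "abend_rate":  (5,    "app-support"),
}

class Monitor:
    """Checks incoming metrics against rules and opens tickets."""

    def __init__(self):
        self.tickets = []

    def check(self, metrics):
        opened = []
        for name, value in metrics.items():
            rule = RULES.get(name)
            if rule and value >= rule[0]:
                # Breach detected: open a ticket and route it to the
                # team responsible for this class of problem.
                ticket = {"metric": name, "value": value, "assignee": rule[1]}
                self.tickets.append(ticket)
                opened.append(ticket)
        return opened

monitor = Monitor()
opened = monitor.check({"cpu_pct": 95, "queue_depth": 200})
# Only the CPU metric breaches its threshold here.
```

A production system would of course add deduplication, severity levels, and machine-learning anomaly detection on top of static thresholds.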

By blending human experience and expertise with automation and AI-enabled apps, mainframes are starting to bridge the legacy of the past with the landscape of the future.

False: Mainframes take up huge amounts of floor space and are becoming irrelevant

True: Mainframes are the size of your fridge, and mainframe capabilities are expanding

The old workhorse mainframes of the past took up hundreds of square metres of data centre space. And they incurred costly electricity and utility fees, with cumbersome air-conditioning units consuming vast amounts of energy.

Today’s new mainframes do not require special space, cooling or power. They are also smaller than the earlier “big iron” machines (remember the first computers that would fill an entire room?). Housed in standard 19” racks, the latest mainframes coexist seamlessly with other platforms in a data centre. A single-frame unit requires 75 percent less floor space and reduces power consumption by 40 percent. To put them in perspective, these mainframes are the size of your fridge.

Mainframes have always been capable of hosting business-critical applications such as Enterprise Resource Planning (ERP), finance and accounting, and big data transactions. However, more organisations are starting to expand the capabilities of mainframes to host and compute next-generation technologies based on AI, blockchain and the Internet of Things (IoT).

Top-end mainframes can now process up to 800 billion transactions daily. As such, modern mainframes will continue to play a significant role in highly digitised, automated and hybrid infrastructures.

DevOps for Mainframe

Irina Kiptikova
Corporate Communications Director
IBA Group

In our earlier video blogs on mainframe systems, we discussed why the mainframe topic is again so hot and how we can automate support of mainframe business applications.

This time we are focusing on DevOps for mainframes.

I invited Yuliya Varonina, one of IBA’s leading DevOps engineers to enlighten us on the subject because I know she is a real DevOps evangelist. The technology writer and analyst, Mark Hillary, is with us once again to help Yuliya share her ideas.

Mark Hillary: “When I started developing code in the 90s, there was no concept of DevOps.”

When I started developing code in the 90s, there was no concept of DevOps. I’d write my code, manually compile it, test it, and if everything looked OK, arrange when I was going to move the test code into production. After that a separate operations team would support it.

DevOps comes from the combination of software development and IT operations – development and operations shortened to DevOps. It’s really a set of practices that combines the development process with ongoing operations and support.

I left coding and started writing about technology before I ever worked in a team using DevOps, so I’m happy to speak to a genuine DevOps expert to find out how it works and how it can be applied to the mainframe environment.

Yuliya Varonina: “All mainframe vendors and IT organizations already build their tools with an eye on DevOps.”

  • What is DevOps?

One question, but a billion answers. Some people say it’s about cool tools, others are sure it’s about methodology and ways of working. Another group says it’s about culture – and all of them are right.

Let’s imagine that you need to meet your customer’s requirements quickly, piece-by-piece and with the highest quality. What are you going to do? You will try to collaborate with your team to find the best tools and a new methodology like Agile. You will also try to automate all repetitive manual tasks to achieve the goal of your customer. 

So DevOps is a culture of automation and collaboration. It’s a set of the best engineering practices complemented by mindset changes.

  • How does DevOps work in the mainframe environment?

Organizations that have mainframes are no exception: just like everyone else, they need automation and collaboration. All mainframe vendors and IT organizations already build their tools with an eye on DevOps.

The mainframe is a modern computer technology that often runs supporting applications created in the previous century. Today, mainframe developers can apply the best engineering practices to their z/OS code development.

We can write mainframe code using the most popular Integrated Development Environments (IDEs). The number of DevOps tools and plugins available for the mainframe platform grows every day.

So we now think the question “Does DevOps belong on the mainframe?” is no longer valid, because development teams around the world have accepted the DevOps transformation and agreed on its advantages.

  • How do we build DevOps for mainframe?

We built pipelines for mainframe code inside our organization using the tools that were available to us, and tried to integrate all of them into one pipeline. Initially, it was mostly based on IBM solutions, such as IBM UrbanCode, and then we scaled to use more and more open source technologies.

Now, we also use Jenkins as a pipeline engine for some of our projects. It was not a one-day decision. We started by automating a small scope of manual operations and kept expanding. The time saved freed us for innovation, and we then came up with the idea of a pipeline constructor and DevOps as a service.
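The idea of a pipeline constructor can be sketched roughly as follows. This is not IBA's actual implementation (their real pipelines run on engines such as Jenkins or IBM UrbanCode); the stage names are invented for illustration:

```python
# Toy pipeline constructor: each stage is a plain function, and a
# customer-specific pipeline is assembled from a list of stage names.
def compile_sources(ctx):
    ctx["log"].append("compile")
    return True  # pretend compilation succeeded

def run_tests(ctx):
    ctx["log"].append("test")
    return True

def deploy_to_test_env(ctx):
    ctx["log"].append("deploy")
    return True

STAGES = {
    "compile": compile_sources,
    "test": run_tests,
    "deploy": deploy_to_test_env,
}

def run_pipeline(stage_names, ctx):
    """Run the configured stages in order; stop at the first failure."""
    for name in stage_names:
        if not STAGES[name](ctx):
            return False
    return True

ctx = {"log": []}
ok = run_pipeline(["compile", "test", "deploy"], ctx)
```

Because the pipeline is just a configured list of stages, each customer's "construction set" can be assembled in a different order, which is the point of DevOps as a service described below.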

  • What is DevOps as a service?

We talk about DevOps as a Service when we talk about our flexible pipeline architecture. We understood that we can’t build a common CI/CD pipeline for all mainframe organizations and share it as a best practice. There’s no silver bullet. We need to be flexible and organize the technical part of DevOps for every organization as something unique, because each organization has its own set of platforms, infrastructure, culture, and rules. So we need to be ready to use the integration engines, bug tracking systems, and test management tools available at each customer.

We also need to take into account programming languages, test automation frameworks, and a hundred other issues to prepare the best technical solution. We can share the architecture we built for our organization, but we know that it should be built for each customer in a unique way, just like a children’s construction set with the same cubes assembled in a different order.

We at IBA played with our cubes for mainframe products, and now we know how we can help others take the path of DevOps. We research the infrastructure, tools, and challenges and can advise how to rethink all of these to be technically close to DevOps. We can also give advice about cultural changes, based on our own experience. I don’t want to go into detail though. As Irina said, I am a DevOps evangelist, and I can talk about DevOps for hours.

This is our third blog post in a series of video discussions on mainframe systems. Please share your thoughts about the discussion or offer your topics for future videos by leaving your comments or suggestions here.

Managing Mainframe Business Continuity

Irina Kiptikova
Corporate Communications Director
IBA Group

This video blog is a continuation from our earlier vlog on mainframe systems, where we mentioned that the mainframe platform has been changing, as well as the mainframe service models.

Oleg Lapushanski, Head of the Multiplatform Technologies Department at IBA, tells us about the new mainframe service models, and how he solves mainframe support problems using automation and new experience management methods.

Mark Hillary, our blogger and a well-known industry analyst in the area of technology and customer experience, helps Oleg share his ideas.

Mark Hillary: “After the first webinar, I wanted to learn more about how mainframes are managed in 2020.”

I recently participated in a webinar hosted by IBA focused on why mainframes have suddenly returned to the top of the news agenda.

Governments, banks, and airlines have been overwhelmed by the effect of the Covid-19 coronavirus pandemic globally. All of these different business sectors rely on mainframe computers – to process citizen unemployment claims, banking transactions, or ticket allocations.

We might not realize it, but there are still a lot of mainframes out there working for many of the brands we use on a daily basis. After the first webinar, I wanted to learn more about how mainframes are managed in 2020, so I’m pleased to have Oleg from IBA on the line to answer some more questions.

Oleg, I know that you have done a lot to bring mainframe management into the 21st century. How did you originally get the idea that maybe some of the applications can be automated? What value does this approach bring to the clients you are working with? Do you have a way of capturing problems and experience so the system can learn for the future?

Oleg Lapushanski: “I connected my professional career with mainframes and realized that we can improve a lot.”

Managing operations in increasingly complex systems gets harder with each passing day. Limited resources, poor experience management, distributed teams, lack of proper Z resources, difficulties with involving young people in the Z area…

How can we make operational problem resolution less stressful, and how can we use the knowledge and experience of the Z community? Answering these questions, we came up with the idea of an automated support management system, which we called APPULSE.

We combined Incident Management and Experience Management with Operations Support for business applications that run on z/OS servers and use different IBM subsystems, including IMS, CICS, Db2, MQ, TWSz and nFTP. As a result, we can identify operational problems faster and at an earlier stage, resolve them more quickly, and achieve business continuity. The support personnel gain free time, which they can use for transformation and development projects.

The system can learn for the future. When a problem is new, the support person resolves it from APPULSE and then performs an incident review, that is, formalizes this kind of operational problem by creating an incident template. All these incident templates feed into the neural network and are extracted when a similar incident occurs. The real incident values are added as parameters on the fly. We then review the solution prompted by APPULSE, validate it, and execute it.
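To make the template idea concrete, here is a minimal sketch of template matching with on-the-fly parameter substitution. The incident text and resolution are invented examples, and APPULSE's real matching uses a neural network rather than plain regular expressions:

```python
import re
from string import Template

# One formalised incident template: a pattern that recognises the
# problem text, plus a resolution with placeholders for live values.
TEMPLATES = [
    {
        "pattern": re.compile(r"CICS region (?P<region>\w+) not responding"),
        "resolution": Template("restart region $region and verify connections"),
    },
]

def suggest_resolution(message):
    """Return a parameterised resolution for a known incident, else None."""
    for t in TEMPLATES:
        match = t["pattern"].search(message)
        if match:
            # Real incident values (here, the region name) are
            # substituted into the stored template on the fly.
            return t["resolution"].substitute(match.groupdict())
    return None

suggestion = suggest_resolution("CICS region CICSPRD1 not responding")
# → "restart region CICSPRD1 and verify connections"
```

The suggested resolution still goes to a human for review and validation, exactly as the workflow above describes.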

This way, support people enrich the APPULSE solution database with new types of problem solutions. By digitizing, accumulating, and sharing team experience, we achieve higher performance and help newcomers get into operations support much more quickly.

This is our second blog post in a series of video discussions on mainframe systems. Please share your thoughts about the discussion or offer your topics for future videos by leaving your comments or suggestions here.

Mainframe: Our Heritage or Present?

Irina Kiptikova
Corporate Communications Director
IBA Group

On May 11, we launched our first video blog. We selected mainframe computer systems for the video discussion.

Why mainframe? Again mainframe?

IBA Group has a long history of mainframe software development. This history begins even before the company was officially registered. Roughly 30 percent of our team members develop and support mainframe applications. Why is this subject suddenly so hot today and being discussed everywhere from Bloomberg to CNN?

I invited Andrei Lepeyev, Director of the Mainframe Department at IBA Group, Mark Hillary, our blogger and an author of 17 books on technology and customer experience, and Peter Ryan, a prominent analyst in customer experience management and BPO, to answer these questions.

Mark Hillary: “Big companies are still using these enormously powerful devices.”

When I studied software engineering in the late eighties and early nineties, I had to study the difference between mainframe systems and the IBM PCs that were taking over the world at that time. We did use a Prime mainframe, but even 30 years ago, many of us felt that the mainframe was a dinosaur because computing power was coming to every desk.

You might think that this is even more true today, because we all carry around more computing power in our phones than NASA had back when they put a man on the moon. But across the world, mainframes are still out there processing data for airlines, government agencies and highly regulated industries, such as banks, where rapid change is difficult.

IBM released their new Z15 just 7 months ago and it has the power to process a trillion web transactions a day. You might think that large-scale computing power has all migrated to the cloud, but some companies are still using these enormously powerful devices. For many organizations, it is just too difficult to migrate away from their mainframe.

The problem is that computer science students today don’t bother studying mainframes. Mainframes are just something from the history books. Students want to build mobile apps, games, or code for computers that are widely used. I tried using Google earlier today to find a computer science course that included mainframes and I couldn’t find any – if they do exist then they are well hidden.

This means that there is a scarcity of resources for maintenance and improvement. That’s what we are going to talk about today, because there are still some areas of the world where mainframe expertise can be found.

The mainframe business is alive and well, as IBM announces new z15

Peter Ryan: “Reliance on mainframes has been highlighted by the pandemic. These systems need to be managed more effectively.”

A related problem is that much of the business software developed to help airlines manage ticket sales or banks process transactions was written in COBOL – the Common Business Oriented Language. It’s a computer language that rose in popularity along with mainframes – and then declined as mainframes declined.

So we are in a situation where it is hard to find people who can maintain the machines, but it is also hard to find people who can change the software too. Most software developers with COBOL experience have retired – it is no longer commonly studied in university.

The Covid-19 coronavirus pandemic has vividly demonstrated why we need mainframe and COBOL expertise – many government systems cannot be easily changed without this knowledge. The $2.2 trillion CARES Act passed in the USA in March included a $600-a-week increase in unemployment benefits for American citizens, but until the individual states can update their systems to reflect the new law, nobody is getting that extra cash.

In Oklahoma it is taking over 2 weeks to process unemployment claims. Newly unemployed citizens have a right to these extra benefits during the crisis and yet because the mainframe systems are so slow and cumbersome – and COBOL programmers are so scarce – people are not being helped.

The last time Gartner counted COBOL experts in the USA was 2004. They counted 2 million back then, but also noted that each year 5% of those experts retire. My math may not be great, but that means they are now almost all retired.

There are mainframes out there delivering critical infrastructure and they are using billions of lines of COBOL code that very few people now understand. The problem is made worse because anyone who studies COBOL today would find themselves focusing on maintenance work – it’s not as inspiring as launching a new app.

This reliance on mainframes and COBOL has been highlighted by the pandemic. These systems need to be managed more effectively.

An Ancient Computer Language Is Slowing America’s Giant Stimulus

Andrei Lepeyev: “Mainframe remains unbeaten in terms of security and input/output operations, which is important for transactional applications.”

Currently, we have more than 400 mainframe experts at IBA Group. Initially, we worked on classical tasks. Today, we work on mainframe modernization.

Mainframe remains unbeaten in terms of security and input/output operations, which is important for transactional applications. However, the mainframe platform has been changing, and the service model has been changing too.

For example, IBA’s APPULSE solution monitors the operability of business applications that run on System Z and, with the help of machine learning, offers efficient solutions to problem situations. In addition, the solution implements experience management, which is very important because experts in this field are in high demand.

We are also very active in DevOps for mainframe. It is of great importance because flexibility in the change of functionality is often the most vulnerable part in mainframe applications.

We have traditionally been involved in projects to modernize the architecture of mainframe applications. Initially, we developed applications from scratch, and recently expanded to add open source solutions, such as Zowe. This includes modernization towards a graphical user interface (GUI) and building a flexible service-oriented architecture (SOA).

This is our first blog post in a series of video discussions on mainframe systems. Please share your thoughts about the discussion or offer your topics for future videos by leaving your comments or suggestions here.

IDC Suggests A Focus On Big Data And AI

IBA Group
Mark Hillary

Research company IDC has issued new guidance on IT spending in 2020. IDC suggests that global spending will be around 3% down on 2019; however, several areas of interest, such as Artificial Intelligence (AI) and Big Data, will see an increase.

In general, the approach is that companies need to think strategically about how they will emerge from this challenging year. Digital Transformation and other major projects that might usually take several years to plan and implement are being approved and this is creating opportunities for companies with expertise in these areas.

I believe that we are going to see a wave of transformation this year that positions AI, machine learning, and Big Data analytics at the heart of how many companies are structured. Microsoft is a good example. They rebranded Office 365 to Microsoft 365 and added many new AI capabilities. It’s clear that technology companies see the future as an intelligent cloud.

Some companies realised the developing importance of Big Data – and the tools required to analyze it – some time ago. Alibaba in China is a good example. Originally a retailer, the company first started using Big Data analysis to better understand its customers – after all, if you get to the point where you can almost predict what a customer needs, then you can dramatically improve the customer experience and also drive more sales.

Today, Alibaba offers a much wider range of services, and this has largely been driven by its experience and expertise with Big Data. As the wave of transformation projects grows in 2020, it will be the ability of brands to gather and analyze data that makes all the difference.

However, gathering all this data requires planning and permission. Tools such as facial recognition have faced problems because of concerns over customer privacy, but these tools have also found many new applications. A good example is the Brazilian car-sharing company Turbi. It allows customers to unlock a car using facial recognition – beating the usual tedious procedures required when renting a car.

Where these tools improve the life of end customers, they will be accepted and popular. Now brands must explore how they can use AI and Big Data to transform the service they are offering. These tools allow behavioral insight in real time.

It’s easy to imagine how important this could be for a wide variety of companies. Retailers can predict when customers will shop and what they will want. Telcos can flexibly offer different service packages to customers based on data use. Banks can reduce the need for collections by intervening early with customers who may be about to default.

Although IDC suggests that IT spending will be reduced this year, it will increase in these specific areas. Companies are looking to evolve into 2021 with a new approach and, for some, this will be a dramatic change. 2020 is the year of Big Data.

IoT Use Will Triple By 2022 – What’s Changing?

IBA Group
Mark Hillary

The Internet of Things (IoT) has been hyped for many years. A recent McKinsey report suggests that it has been talked about for at least 15 years, although that sounds a little early to me as many people were only just installing wi-fi networks 15 years ago.

The McKinsey report suggests that the number of companies using the IoT has increased from about 13% in 2014 to about 25% (in late 2019). The projected number of IoT-connected devices is 43 billion by 2023 – that’s about three times the number we saw in 2018.

Sensor technology is getting cheaper and more advanced as investment in the sector continues. In fact, McKinsey estimates that investment will continue to grow by at least 13.6% until 2022.

Early adopters are now delivering projects at scale and over 200 different types of corporate use have already been documented. We are seeing rapid developments in smart cities, smart cars, connected cars, and e-health. B2B companies selling products to other companies – such as machinery – are able to maintain a constant connection to their equipment now using IoT sensors and Digital Twin software. Most of this would have seemed impossible just 5 years ago.

McKinsey suggests that most of this IoT growth will remain inside the enterprise, although devices that are IoT-connected will see a fast uptake. This sounds correct and I can see IoT devices all around my own home now that were simply not available until recently.

One example is my Amazon Blink system. It allows easy-to-install security cameras to be placed anywhere. The cameras are controlled by a central module that connects to a phone app, so it’s easy to record anything that moves while you are out, or just to check a live camera feed. Similarly, I have a Furbo camera. It tells me when my dog is barking, lets me see him and speak to him, and even throw him a snack – all from a phone app.

Many of these applications may seem frivolous, but the important thing to note is that device connectivity is becoming ubiquitous. New cars are now almost all self-diagnosing problems and remaining connected to the manufacturer, constantly passing information on performance back to be checked.

The ongoing rollout of 5G will certainly facilitate an even wider use of sensors on just about everything. 4G develops connectivity issues when many sensors are close together, just like trying to use your phone at a rock concert when you are in a single place with 80,000 other people. As 5G frees us from this problem there will be many new applications, such as a blend of smart cities and smart cars – the vehicles and street furniture transmitting data in both directions.

We are already seeing immense growth in devices and applications that use the IoT, but I believe that 2020 and 2021 will really be the tipping point. Once 5G is more widely available the sky really is the limit – sensors will literally be all around.

SHARE Fort Worth: Retaining the Magic of Real World Communication

IBA Group
Nina Famichova
Corporate Communications Department

As Politico and Bloomberg wrote several weeks ago, the world will never be the same.

Distance learning and distance communication technologies are booming. One might suggest that business and professional communication will securely remain online.

However, we believe that one day we will be able to leave our laptops to find ourselves not virtually, but physically together, having a laugh over a cup of coffee or visiting old friends and partners on the other side of the globe.

With 25+ years of remote work experience behind us, we have to admit that no online platform can transmit that inexplicable part of human communication which always escapes definition. It is the magic of unpredictable, unstructured, and often illogical face-to-face interaction that inspires scientists, motivates employees and, after all, makes the world go round.

In this context, we would like to remember one of the latest events that IBA Group’s team attended.


Riddle: what do the cowboys and the IBA Group’s mainframers have in common?

Answer: they all shake the dust off their boots and go to Fort Worth 🙂

Fort Worth is nicknamed Cowtown for its deep roots in cattle ranching, and this year it hosted the SHARE conference, one of the major events in the mainframe industry. Our DevOps team was among 1,300 participants who gathered to educate, share and connect. The conference featured 500+ technical sessions and hands-on labs on hot enterprise IT topics such as security, open source, DevOps, and cloud.

Here is what our mainframers Yuliya, Dzmitry, Tatsiana, and Valery tell us about the event.



We arrived in Cowtown on February 22, at night. The Sunday morning was not an easy one after a 22-hour transatlantic flight. However, freshly brewed coffee and several Zowe sessions cheered us up. The welcome reception at a real Texas ranch was a nice ending to the day. We were happy to meet our old friends and partners from past SHARE events, and we were also pleasantly surprised to see many new faces this time.



On February 24, I was a bit excited, as it was my first time presenting with Dzmitry. Co-presenting is like a DevOps pipeline in a way: everything depends on teamwork. In our case, the presentation pipeline managed to draw in many members of the audience, who were very responsive and helped us turn our presentation into a discussion that continued at our booth.

Another challenge was the headline, DevOps for DevOps. It intrigued many attendees, and expectations were quite high. Having attracted their attention, we had to maintain the suspense. Dzmitry’s sense of humor livened up the technical part. Here is my favorite joke (see the screen in the photo).

In addition to sharing a laugh, we aimed to show in our presentation that DevOps engineers are ordinary developers who also need version control for their processes, as well as automated builds and testing for their CI/CD pipelines. This is especially true for the mainframe, where DevOps involves a great deal of custom code. That is what our team calls self-DevOpsing 🙂
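The self-DevOpsing idea, treating pipeline code like product code, can be sketched in a few lines of Python. The `build_deploy_command` helper and the `zdeploy` CLI name below are invented for illustration; they are not IBA Group’s actual tooling.

```python
# Hypothetical illustration of "self-DevOpsing": pipeline code is code,
# so it gets version control, automated builds, and unit tests like any
# other code.  The helper and the "zdeploy" CLI name are invented.

def build_deploy_command(dataset: str, member: str, target_env: str) -> str:
    """Compose the (hypothetical) CLI call a pipeline step would run."""
    if target_env not in {"dev", "test", "prod"}:
        raise ValueError(f"unknown environment: {target_env}")
    return f"zdeploy --dataset {dataset}({member}) --env {target_env}"

# Unit tests for the pipeline helper itself -- the point of self-DevOpsing.
def test_build_deploy_command():
    cmd = build_deploy_command("MY.PDS.SRC", "PAYROLL", "test")
    assert cmd == "zdeploy --dataset MY.PDS.SRC(PAYROLL) --env test"

def test_rejects_unknown_env():
    try:
        build_deploy_command("MY.PDS.SRC", "PAYROLL", "staging")
        assert False, "expected ValueError"
    except ValueError:
        pass
```

Checking such helpers into version control and running their tests in CI is exactly the discipline the presentation argued DevOps engineers owe their own code.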

Overall, our presentation was well received; it was nice to get feedback not only inside the SHARE network but also on LinkedIn and Twitter.

On February 25, Tatsiana and Valery spoke about technical and cultural solutions for building DevOps on the mainframe in their presentation zDevOps: What We Do, How We Do. Many of last year’s participants were looking forward to their presentation to find out about the project’s progress over the past four months.

My first conference experience


I remember my first SHARE conference in Whittlebury last year. I was absolutely amazed by the scale of the event and the countless opportunities it presented. At first, I was just listening, not daring to ask questions or express any ideas of my own in front of people whom I considered legends of the mainframe. This time, I was more confident. First, I came to realize that my ideas are appreciated and listened to; second, I am not alone. It is great to feel the support of your team at such events. It really feels like traveling with my family.



Booth 303 at the SHARE Technology Expo became our second home in a way, a place where we gathered to discuss the presentations we attended, share insights, and set up appointments with other attendees.

We also took part in several Discussion Lounge conversations that followed technical sessions. What I like about presenting at SHARE is that you are not expected to know all the answers. You can also put questions to the best industry experts and get answers that you would otherwise never have found.

Other networking opportunities included themed evening receptions and networking breaks where one could relax and have a chat with colleagues from all parts of the world.

Face-to-Face vs Online


I think that with the development of online communication, offline contact will become even more important and valuable. Impressions and contacts are the most precious things we bring home from real-world conferences. Online chats cannot replace face-to-face contact. For example, our DevOps team had taken part in dozens of automation trainings online, but the test automation framework sat unused until Tanya and Yuliya brought hands-on experience back from SHARE. Thanks to their energy and commitment, the test automation framework has become part of our daily workload. Online screens cannot inspire and motivate you the way real speakers do; they do not transmit the charisma or give the sensation of a once-in-a-lifetime moment.

Expectations vs reality


I expected to catch up on what’s going on in the industry, build new connections, and reconnect with established ones. It was a perfect opportunity for me to see where the mainframe industry is heading. The scale of the event was impressive, with over a thousand attendees and huge presentation areas. Despite its size, the event felt very cozy and homelike. In addition, the Texan barbecue was fantastic.


The conference was an excellent opportunity to get an insight into what is going on inside the “mainframe box”. The positive thing is that the audience at our sessions gets bigger with every conference. This year, we had a record number of 60 people at our presentation. The most rewarding part is seeing that you are gaining fans, people who come to your session at every conference. We were also glad to talk to our friends and partners from Compuware, 21st Century, Rocket, and IBM.

Hot Topics


The trending topics this year were DevOps and open source. As Greg Lotko from Broadcom said in his session, “Connectivity and openness can release the power of mainframe and fuel innovation”. New technologies open up the mainframe to a new generation of developers. With automation and DevOps, Devs and Ops start working as one team. As a result, you get better visibility, more frequent releases, and happier customers.

Hybrid cloud is another popular technology that makes the mainframe more affordable while maintaining the same security level.

Another trend that cannot go unnoticed is the fast development of open source projects, both large ones like Zowe and small ones supported by a few enthusiasts.


Tatsiana and Yuliya

To my mind, the most interesting presentations were those on DevOps and test automation. The presentation about the Galasa framework, Solve Your DevOps Pipeline Headaches With an Open Source Framework for Test Automation, by William Yates from IBM UK Laboratories, was a real blast. IBM developed the framework and released it in an open source git repository (see the link in the photo). Galasa makes the mainframe more accessible to non-mainframers. It is an excellent opportunity for young developers to play with the mainframe outside the native environment. Its extensible nature and open source model make it easy to integrate with other testing tools of your choice. At the moment, we are discussing the applicability of the framework to our projects.

Another interesting session, titled True Unit Testing for z/OS Application, came from Rosalind Radcliffe and Suman Gopinath of IBM. They talked about the importance of unit testing at the lower levels, on the developers’ side. The more unit tests you have, the fewer tests you will need at later levels. ZUnit is an automated testing tool delivered in IBM Developer for z/OS. I hope that with time its scope will widen beyond COBOL and CICS and that it will help attract more users.
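ZUnit itself targets programs on z/OS, but the “test at the lowest level” idea is language-neutral. Here is a minimal sketch in Python; the `late_fee` business rule is invented purely for illustration.

```python
# Language-neutral sketch of the "test at the lowest level" idea behind
# ZUnit.  The business rule below is invented for illustration; ZUnit
# itself generates and runs tests for programs on z/OS.

def late_fee(balance_cents: int, days_overdue: int) -> int:
    """Return a late fee in cents: 2% of balance, capped at $50, zero if on time."""
    if days_overdue <= 0:
        return 0
    return min(balance_cents * 2 // 100, 5000)

# Unit tests pin down the rule's edge cases early, so fewer defects
# survive to integration and system testing.
assert late_fee(100_000, 10) == 2_000   # 2% of $1,000
assert late_fee(1_000_000, 5) == 5_000  # capped at $50
assert late_fee(100_000, 0) == 0        # on time: no fee
```

The payoff the speakers described is exactly this: each rule verified at its own level means far fewer scenarios to cover in the slower, later test stages.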

The presentation Lean Mean Machine – Keeping the Lights on for Agile/DevOps by Jeremy Hamilton focused on how DevOps engineers should define and deliver the best value to the customer. For this purpose, the Dev and Ops teams and the customer need to keep in close contact to align expectations and desired outcomes along the way.

Remote work


We are one of the few mainframe teams in Belarus. Most of our projects are carried out remotely, as our major clients are from Europe and the US, including some of the biggest players in the mainframe market. Of course, 24/7 online contact with clients and project managers and the Internet help you stay tuned to the industry’s updates, but only at SHARE events do you get the unique feeling of being part of the global mainframe community.

Future plans

Yesterday was the deadline for paper submission for the next SHARE conference in Boston. So, our team had a real DevOps party last night, as the best ideas always come last. As of today, the organizers have no plans to modify, postpone, or cancel SHARE Boston, and we are looking forward to plunging into the buzzing atmosphere of a real-world SHARE conference!

How to Work from Home Efficiently: An IBA Experience

IBA Group
Darya Kavaleuskaya
Corporate Communications Department

The novel coronavirus (COVID-19) is ravaging the world. Every day, people and businesses around the world are doing their best to adapt to the sad new circumstances. Hard times require urgent and well-considered actions. Keeping an eye on the epidemiological situation, IBA decided to send home the employees whose projects were suited to remote work.

Hence, the wheels were set in motion. People clutched their laptops, Ubers arrived at the gate one after another, and the once-busy IBA Group campus started to feel quieter. Activities at the board game club stopped, lunch and coffee breaks became memories, and the number of shuttles between the IBA office and the nearest metro station decreased.

IBA Group’s campus feels quieter

IBA employees might find these new circumstances unusual, but they are not letting the change break their spirit or kill their working mood. Some of them share tips on social media about how to stay safe and productive, while others keep the team spirit and office culture alive by conducting coffee breaks online. To help everyone stay healthy and active, the IBA fitness center is planning to broadcast fitness classes.

However, while working from home might be new for IBA Group’s employees, remote work in general is something that has been at the heart of IBA’s business processes for 27 years. As the company has successfully completed a huge number of outsourcing projects, our remote communications have been refined to the point where transition to working from home is quick and seamless. All it takes is installing the right software and transferring the equipment out of the office.

Here is a recap of what we did to ensure this smooth transition.

  1. Access to the IBA network. Using Check Point Mobile, IBA employees stay connected to the IBA Group network with full access to mail and other corporate resources.
  2. Communication platforms. IBA Group employees conduct meetings and exchange project status updates via a number of apps and platforms, including Skype, Webex, and Zoom.
  3. Proprietary cloud solution. The IBA Group’s ICDC cloud solution enables employees to share big files and allows them to create virtual workstations.

IBA Group is conducting the transition to remote work gradually and in full compliance with customers’ requirements. We have fully prepared our infrastructure and business processes. Our numerous customers in diverse locations report that they have not noticed any change in IBA Group’s performance.

In a world where travel is banned and borders are closed, staying connected is essential. We continue to monitor the coronavirus situation in all countries where IBA Group operates. If anything changes, we are ready to react quickly and efficiently.

Keeping the team spirit alive: Online coffee break

Becoming Strategic with RPA

IBA Group
Mark Hillary

As I have often commented on this blog, many critics of Robotic Process Automation (RPA) have spent years telling the world that it is all hype. One of the real reasons for this is that many of the benefits of automation are hard to quantify using traditional cost/benefit analysis. Given the amount now invested in RPA, it is no longer possible to call it a hyped technology, but how are managers measuring the value of these projects?

Managers always take time before launching new projects to plan the Return On Investment (ROI) of any new investment – this is a basic requirement for any new spending – but a new book argues that most managers are not pricing the advantages of RPA correctly.

Yes, it’s a book. Leslie Willcocks, John Hindle, and Mary Lacity are co-authors of ‘Becoming Strategic With Robotic Process Automation’. It comes from the academic and business school side of this argument: managers are introducing new technologies into the workplace, but they are still measuring ROI using old tools and assumptions.

More enlightened managers have started focusing on the Total Cost of Ownership (TCO) when investing in new projects or technologies. However, the authors advise that although this is a step in the right direction, it is still hard to capture the benefit of automation using this model so they have built their own and called it Total Value of Ownership (TVO).

The book argues that to date it has been hard to establish and state all the key benefits of RPA installations; therefore, this TVO model is required. In traditional business cases, only the hard financial costs and benefits are included, but RPA is a strategic tool, and the value it creates also needs to be captured. Managers need to understand that RPA is more like a platform on which other solutions can then be created; it is not a tool with a single use and a single, easily measurable value.

Describing the ideas around TVO, the authors say: “The idea here is to establish every major activity and monitor the five resource costs associated with each activity, across the RPA life-cycle. An understanding of full costs will guide investment strategically, and galvanize commitment to gain substantial returns from the investment. For example, if managers knew the real initial cost of getting the data into shape for use by cognitive tools, they would become much more committed to driving out value from tool adoption.”
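The bookkeeping the authors describe, five resource costs tracked per major activity across the RPA life cycle, could be sketched roughly as follows. The activity names and figures are placeholders invented for illustration, not numbers from the book.

```python
# Hedged sketch of the TVO bookkeeping described above: track five
# resource costs per major activity across the RPA life cycle.
# Activity names and figures (in $k) are invented for illustration.

LIFECYCLE = {
    "data preparation": {"software": 10, "hardware": 5,  "people": 40, "training": 15, "support": 5},
    "bot development":  {"software": 20, "hardware": 5,  "people": 60, "training": 10, "support": 10},
    "operations":       {"software": 15, "hardware": 10, "people": 25, "training": 5,  "support": 20},
}

def activity_cost(costs: dict) -> int:
    """Sum the five resource costs of a single activity."""
    return sum(costs.values())

def total_cost_of_lifecycle(lifecycle: dict) -> int:
    """Full cost across the whole life cycle -- the figure that should guide investment."""
    return sum(activity_cost(c) for c in lifecycle.values())

for name, costs in LIFECYCLE.items():
    print(f"{name}: {activity_cost(costs)}")
print("total:", total_cost_of_lifecycle(LIFECYCLE))
```

Even this toy version makes the authors’ point visible: “data preparation” carries real cost of its own, which a purely per-tool ROI calculation would hide.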

In their 2018 paper “Notes From The AI Frontier”, the McKinsey Global Institute (MGI) supports this assumption around value. MGI suggests that by 2030, the augmentation and substitution impacts of AI technologies will deliver a 14% boost beyond 2018 economic performance, while the boost from the impact of AI technologies on product and service innovation and extension will be 24%. Three examples they give are expanding the firm’s portfolio, increasing channels, and developing new business models. In terms of global GDP, McKinsey estimates the innovation impact of AI technologies as potentially a 7% increase, representing US$6 trillion, between 2018 and 2030.
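A quick back-of-envelope check shows the quoted figures hang together: if a 7% innovation-driven increase corresponds to US$6 trillion, the implied baseline is roughly US$86 trillion of global GDP.

```python
# Back-of-envelope consistency check of the McKinsey figures quoted above:
# a 7% increase worth US$6 trillion implies a baseline of 6 / 0.07 trillion.

innovation_boost_usd = 6e12   # US$6 trillion
innovation_boost_pct = 0.07   # 7%

implied_gdp_baseline = innovation_boost_usd / innovation_boost_pct
print(f"implied global GDP baseline: ${implied_gdp_baseline / 1e12:.1f} trillion")
```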

I have yet to read the complete book, but the approach sounds fascinating. What the authors are really suggesting is that we should forget about RPA as an individual tool and stop trying to measure hard and direct benefits – the real value will be in how it completely changes what your company can do a decade from now. It seems to me that a similar argument about productivity could have been made thirty years ago when companies started using email. If managers were doing a cost benefit analysis comparing the cost of sending a letter and sending an email they really were missing the point. That’s where we are with RPA today.

Leslie Willcocks, John Hindle, and Mary Lacity are co-authors of ‘Becoming Strategic With Robotic Process Automation’, published in October 2019 and available at www.sbpublishing.org. For supporting new videos and papers, go to www.roboticandcognitiveautomation.com


Can Microsoft Take The Lead in RPA?

IBA Group
Mark Hillary

The use of Robotic Process Automation (RPA) has exploded in recent years, and although some analysts called much of this growth hype, there have been many examples and case studies showing that the RPA market really is developing fast. One of the most conservative commentators, HFS Research, even noted that growth in 2018 exceeded its own estimates, so this growth is real.

More recent estimates looking at the next few years suggest that we can expect market growth of just over 20% per year through 2025. QY Research notes some specific trends developing in the near future too:

  • Due to high technology development and the involvement of developed countries, North America accounts for the largest share of the robotic process automation market.
  • Europe is one of the leading players in the robotic process automation market. Countries like the U.K., Germany, and Italy are the major contributors to market growth due to their rich manufacturing and automotive industries.
  • Asia-Pacific is expected to emerge as the fastest-growing market during the forecast period. The development of Asian countries and the demand for consumer electronic goods compel manufacturers to implement cost-effective technology in the manufacturing process.

But there is one big RPA story that I have not seen covered by many of the analysts: there is a big new player in town, Microsoft. At present, there are over 20 commercial RPA providers, and the top 5 or 6 of them are scooping up most projects, but will Microsoft change the dynamics of which software companies get chosen?

VentureBeat published a very interesting analysis of Microsoft’s plans, and they believe it goes far beyond RPA alone. RPA is starting to define how companies design the software they use (it is more like a platform), so any company offering RPA solutions has the opportunity to design other solutions too.

However, on the downside, the Power Automate system from Microsoft does not yet have as many features as the existing leading RPA software suppliers. Naturally, they will work hard to catch up, but at the same time, the leaders can keep improving and adding even more features.

Microsoft is coming from behind on RPA, as the market is already quite well established. Many will also want to see them update their software and move through a few versions before trusting it. For these reasons, I think there is potential for change in the RPA market, but if it happens, it will be a couple of years from now. In 2020, the existing major players will keep leading.


Feel the Power of Open Source Hybrid Cloud

Pavel Shkilionak
Director, Open Source Hybrid Cloud
IBA Group

Since 2019, the hybrid cloud market has been hyped, and it keeps attracting more and more new players. However, only three major hybrid cloud solutions are currently on the market, namely AWS Outposts, Azure Stack, and Google Anthos.

The demand for ready-to-use on-premise hybrid cloud solutions primarily resulted from the need for single-digit millisecond latency to end users or onsite equipment. For instance, the healthcare sector needs to enable rapid retrieval of medical information by storing data locally. In manufacturing, there are SCADA systems and applications that need to run close to the factory floor equipment. Media and entertainment companies need on-premise access to the latest GPU innovations for graphics processing, audio and video rendering, and running media applications.

All three of the big public cloud providers offer ready-to-use rack-based on-premise solutions. However, the most mature one is AWS Outposts, which has a clearly defined delivery model and pricing policy.

AWS offers Outposts as a rack-based box: a whitebox on the front and a blackbox inside. This blackbox is delivered to the customer’s on-premise data center, where the Outposts infrastructure and AWS services are managed, monitored, and updated by AWS just as in the cloud. The customer’s technical support team has no access to the blackbox. Instead, AWS monitors it as part of the public region and automatically executes software upgrades and applies patches. If physical maintenance is required, AWS will reach out to schedule a time to visit your site.

AWS offers Outposts as a rack-based box

Outposts provides only a few core cloud services out of the huge number of well-known AWS public services. These core services are EC2 compute instances and EBS block storage. Additionally, Outposts provides Amazon ECS to orchestrate Docker containers, Amazon EKS to manage and run Kubernetes, and Amazon RDS to support the MySQL and PostgreSQL database engines.

Regardless of the configuration, the pricing model is quite clear and based on a combination of EC2 instance and EBS gp2 storage tiers. Pricing for these configurations includes the EC2 instances and EBS storage, as well as delivery, installation, and maintenance.
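The combined pricing idea can be illustrated with a toy calculator: a monthly charge built from an instance tier plus gp2 storage. The rates below are invented placeholders, not actual AWS prices.

```python
# Sketch of an Outposts-style combined charge: compute tier plus gp2
# storage.  The rates are invented placeholders, not actual AWS prices.

def monthly_cost(instances: int, rate_per_instance: float,
                 storage_gb: int, rate_per_gb: float) -> float:
    """Combined monthly charge for compute plus gp2 storage."""
    return instances * rate_per_instance + storage_gb * rate_per_gb

# Hypothetical configuration: 10 instances at $500/month each,
# plus 4 TB of gp2 storage at $0.10 per GB-month.
cost = monthly_cost(instances=10, rate_per_instance=500.0,
                    storage_gb=4096, rate_per_gb=0.10)
print(f"estimated monthly charge: ${cost:,.2f}")
```

The point of the model is predictability: a customer can read the monthly charge off the chosen instance and storage tiers before the rack is ever delivered.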

ICDC Model 2020 Open Source Alternative

ICDC Model 2020 is a ready-to-use hybrid cloud solution

ICDC Model 2020 is a ready-to-use hybrid cloud solution: a fully managed service that delivers pre-configured hardware and software to the customer’s on-premise data center or co-location space. Model 2020 is designed to run applications in a cloud-native manner without having to use any public data centers.

Like AWS Outposts, Model 2020 offers a rack-based box that looks like a blackbox from the front, and is crystal clear and transparent inside.

This rack-based box is also delivered to the customer’s on-premise data center and connected to the local customer infrastructure to deliver IT-as-a-Service, implementing DevOps and ITSM practices in accordance with ITIL V4.

Similarly to AWS Outposts, the basic ICDC Model 2020 configuration provides core cloud services as follows:

ICDC Compute offers self-service provisioning of virtual CPU, memory, and graphics resources, as well as Marketplace, network management, and load balancing. It supports both x86-based and IBM Power architecture.

ICDC Openshift service is designed for management and orchestration of Kubernetes containers.

ICDC Storage provides a full range of object, block, and elastic file system capabilities. It runs on all-flash NVMe SSD drives and is used in ICDC Compute and Openshift services.

ICDC Compute

In addition to the basic configuration, Model 2020 provides an extended edition with major team productivity and collaboration cloud services:

ICDC Disk brings new possibilities for team collaboration, enabling the exchange of files and documents across an organization, while ICDC DevOps offers groupware that enables an organization’s teams to work in an agile manner, more securely, and faster, bringing more value to corporate customers and providing IT services in accordance with ITIL V4.

Model 2020 is designed as a High Performance Computing (HPC) solution built on Lenovo ThinkSystem with NVIDIA Tesla V100 GPU and IBM Power 9 servers, as well as Cisco Nexus 10/25/100G network uplink options.

Model 2020 is designed as a High Performance Computing (HPC) solution

Given such characteristics, ICDC Model 2020 is a great ready-to-use solution for developing products and provisioning services for AI, ML, RPA, IoT, and Blockchain technology stacks.

Both configurations offer Compliance, Help Desk, and Monitoring services provided by the ICDC team as part of the support and maintenance offered under a regular subscription model. The subscription includes regular updates for all ICDC cloud services as soon as the underlying open source software releases new versions.


The Open Source Hybrid Cloud Model 2020 is beneficial for customers whose applications need to run on premises due to low latency, local data processing, or local data storage requirements.

From a financial perspective, ICDC Model 2020 significantly increases OpEx savings by moving cloud services back from public data centers to customers’ sites. In addition, it reduces the TCO of managing IT services and infrastructure thanks to its completely open source architecture, which helps avoid vendor lock-in and limited IT agility.

Where Will Cloud Computing Go in 2020?

IBA Group
Mark Hillary

Forrester Research has published a new paper exploring how cloud computing will change and evolve in 2020. The public cloud market is expected to reach around $411 billion by 2022, so it pays for executives to think ahead to the trends that will shape the market this year.

The research identified 5 key areas to watch:

  1. Change in the player landscape; large companies such as IBM and Oracle are likely to focus more on applications and development – withdrawing somewhat from cloud services. Alibaba will join AWS, Microsoft, and Google as a leading supplier of cloud services.
  2. SaaS vendors scrapping proprietary platforms; SaaS vendors need to focus on the applications they are offering, not the basic infrastructure of the cloud. It’s likely that more vendors will just use services such as CloudSuite on AWS.
  3. High-Performance Computing in the Cloud; HPC is expected to increase by at least 40% in 2020. It has traditionally been quite difficult to dynamically allocate (and reduce) resources fast enough to support HPC, but cloud management systems are now getting smart enough for it to be feasible.
  4. Service meshes; applications are being broken down into smaller and smaller components and these microservice groups are being managed using containers – managing many thousands of these simultaneously is a challenge.
  5. Cloud management focusing on security; security has always been on the radar for all the cloud management systems, but after the Capital One data breach – which involved data hosted on AWS – everyone is now paying far more attention to securing systems and data.

The Forrester research identifies some important trends, not least the creation of a ‘Big 4’ in cloud services. Alibaba is already a major player in China, but it will be interesting to see if they can develop their services in markets such as Europe and the US.

To my mind, the rise in HPC and focus on security are the two important areas to watch. It is becoming possible to do much more with cloud services than ever before and this will lead to more investment and growth, but as more sensitive data and services move to the cloud it will be important to maintain a focus on security – this is one trend that nobody can ignore.


Can RPA Break Free of the Hype Cycle?

IBA Group
Mark Hillary

Robotic Process Automation (RPA) is one of the hottest areas in the technology industry right now, so why would one of the biggest RPA specialists, UiPath, be laying people off rather than hiring as fast as they can? This was the surprise news when UiPath let go 400 employees (out of a total of about 3,200), even though the company was valued at over $7 billion after a recent healthy funding round.

UiPath has defended the cuts by saying that they were all strategic. A comment in VentureBeat points out that there are over 90 open jobs at UiPath right now and these cuts really just reflect the end of a period of manic growth – it’s time to step back and ensure that everyone is focused on strategic growth.

Speaking to Information Age, Phil Fersht, founder and CEO at HFS Research, said: “UiPath is realising to its cost that intelligent automation is a marathon, not a sprint. It pushed the hype around RPA far too aggressively.”

There are many opinions swirling around, but as these commentators suggest, growth in RPA has been extremely fast. There are many different suppliers all competing in the same market, and it is simply not feasible to have over 20 software companies all offering the same, or a very similar, service.

It’s easy to see why RPA has been hyped by the media and the software companies supplying RPA systems. There is a huge growth in interest in this technology and there are a large number of suppliers all competing for market share. The first to grab a large proportion of market share is most likely to succeed.

I remember visiting IBA Group at the end of last year and talking to their RPA team about how to choose between the various software suppliers. In an ideal world, a client would engage a partner like IBA to implement an RPA project and the initial phase would be to evaluate which software would offer the best solution for the specific needs of that company.

Most of the time this structured approach did not appear to be possible because the software companies would rush in, build a pilot to demonstrate their capabilities, and the client would say ‘I like that’ and ask their technology partner to implement it. Clearly this is not ideal, even with mature technologies, let alone nascent ones that are evolving this rapidly.

There are enough case studies out there now to prove that RPA is not just hype, but it has been in the interest of the leading software companies to ensure it is hyped – excitement is good for business. However, as UiPath has found out, if you are constantly sprinting then you will eventually need to take a rest before continuing with the race.

I am sure that UiPath is going to be OK. They just needed to focus on their product a bit more – manic growth is what happens in every unicorn. The RPA market in general will also be OK – it’s proven and there are global case studies that demonstrate the value. Where we will see some change in the months ahead is in more measured thinking and reporting about the value of RPA – it’s time to end the hype and really just focus on the benefits. There is already enough to say without the hype anyway.


Cloud Spending

IBA Group
Mark Hillary

What is the fastest developing area of IT that is taking the biggest chunk of your budget? If you said Cloud Computing then you are right. Spending on cloud systems, both for storage and the use of applications, is soaring. A recent survey indicated that cloud now accounts for a quarter of all IT budgets globally.

That’s enormous – a quarter of all IT spending globally?

However, it’s not a surprise when you consider how cloud computing is expanding and becoming more important at different levels of the organization. Think about the three main areas where cloud strategies are significant:

  1. Infrastructure as a Service (IaaS); offering unlimited storage and computing power by buying what you need, when you need it
  2. Platform as a Service (PaaS); offering access to platforms on which you can build new services, such as Windows Azure from Microsoft
  3. Software as a Service (SaaS); offering access to tools you can use online and only pay for them as you need them – in contrast to software that needs to be installed and maintained locally in your office or on your computers

As you can see, the cloud-based approach is changing how software is designed and used, how platforms that support other solutions are built, and how the raw computing power and storage itself is managed. All these changes in strategy also affect how the services are paid for – in general companies mostly pay for services only as they are used, not up front by purchasing equipment and office space where it can be used.
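The pay-as-you-go trade-off behind these service models can be illustrated with a simple break-even sketch. All figures below are hypothetical, chosen only to show the shape of the calculation.

```python
# Sketch of the pay-as-you-go trade-off behind the "as a Service" models:
# instead of an up-front purchase, cost accrues only while capacity is used.
# All figures are invented for illustration.

def months_until_breakeven(upfront_cost: float, monthly_cloud_cost: float) -> int:
    """First month in which cumulative cloud spend reaches the up-front price."""
    month, spent = 0, 0.0
    while spent < upfront_cost:
        month += 1
        spent += monthly_cloud_cost
    return month

# Hypothetical example: $120,000 of servers bought up front
# vs $4,000/month of equivalent IaaS capacity.
print(months_until_breakeven(120_000, 4_000))  # → 30
```

In practice, the comparison also has to account for maintenance, power, office space, and the ability to scale the cloud spend down in quiet months, which is exactly why the budget shift described above is happening.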

But there are two issues raised by this change: security, and how to manage the various cloud systems a large company might be using.

Cloud security is often in the news. When we read reports about millions of customers losing their personal data because a company did not secure its cloud, it is a real concern. This can be even more important in markets like Europe because of legislation such as GDPR, which has the power to levy heavy fines on companies that do not secure customer data.

And all these clouds need some kind of central control: a cloud management platform (CMP). There are many CMP systems out there, and each has a different area of focus. Which is the right one for your business, and will it be flexible enough to change as your business changes?

These are important areas of business strategy that need to be better controlled and defined for the cloud era. If 25% of the IT budget is already being spent on the cloud and yet there are still concerns over security and CMPs, then I’m sure there will be more cloud disasters before long. A cloud strategy is becoming essential, but make sure that as you design yours, you also take time to plan how to manage these more difficult issues.


The Soft Skills You Need To Work in DevOps

IBA Group
Mark Hillary

I have written about DevOps recently on this blog. You can look at my DevOps introduction here, but to summarise the concept, DevOps refers to Development and Operations. It is a combined set of software development practices that bring together the development of software with IT operations. The aim is to improve the systems development environment so that the software lifecycle can be shorter; it brings software development closer to the business it serves.

So far, it sounds like DevOps is just focused on software development and the environment used to build software systems; it is all about coding and process. So what skills are needed to work in DevOps? Logically, you might assume that ‘hard’ subjects such as science, technology, engineering, and mathematics (STEM) would dominate, but I recently read an article that turns this expectation upside down.

Tech Beacon magazine listed four key ‘soft’ skills that everyone working in DevOps requires and none of them are focused on STEM skills. They are:

  1. Collaboration and Communication
  2. Empathy
  3. Customer Experience
  4. Problem Solving

Why would these be the essential skills and not coding or process design? Well think for a moment about what I said in the introduction – we are bringing the software development process closer to the business that needs this technology system. So some of the key skills will be focused on that process of getting closer to the sponsoring business people.

Collaboration with people outside the IT team will be essential, as will the ability to communicate technical problems to non-IT professionals. Empathy implies more listening – especially listening to the people who want the system built for their business. Putting yourself in the shoes of the customer so you can improve the customer experience is also an important skill that many people ignore, and the ability to solve problems as they are thrown at you is extremely valuable in any team.

I would argue that these four skills are essential for any DevOps team. If you can find people to join the team with all these skills then it is almost certain they can learn the technical skills you need them to use. If you hire for technical ability only then it will be much harder to create great communicators or problem-solvers through training.

It may seem like the opposite of conventional wisdom, but sometimes the best team members in a technical DevOps team are the least technical.


IT Outsourcing Is Now At The Heart Of Smart Corporate Culture

IBA Group
Mark Hillary

I have often written on this blog about the nature of outsourcing and how it is changing and evolving – especially in Europe. For several years now I have been exploring how modern delivery methods for software and IT services have been changing – especially the way that enterprise software has followed consumer behavior towards cloud-based services or systems available using a method similar to the App Store.

This isn’t a controversial view, unless you are still defending the traditional method of IT outsourcing, but it is always worth backing up an opinion with research. Scan the pages of the Deloitte 2018 Global Outsourcing Survey and you will see that outsourcing is not only changing because of delivery methods, but also because it is more often being used to drive transformational change.

The business case for IT outsourcing today often depends on how disruptive a project can be. How can we replace the traditional way of delivering a service and completely disrupt the market?

The Deloitte research involves feedback from over 500 executives, and 86% of them work in companies with revenue above $1bn a year, so this strategic use of outsourcing is becoming mainstream in large organizations. It is becoming clear that outsourcing is now seen as highly strategic for a number of reasons:

  1. Skills; it allows access to expertise in emerging technologies such as cloud, RPA, and data analytics, without the client needing to build those skills in-house.
  2. Innovation; service providers are taking on a new role that is more explicitly about seeing the future – they are expected not just to deliver IT projects, but to offer ideas on how the client can operate in future.
  3. Security; traditional structures never placed data security at the heart of an organization, but modern service providers can introduce these practices.

The use of outsourcing is accelerating, and in some industries the increase is dramatic: 39% of finance managers already work with partners for technology services, and 89% of them expect to be seeking a partner soon.

One of the most interesting findings from the Deloitte research is that the changing nature of outsourcing is not just about finding a new partner. Most of the time companies are using outsourcing to find a new solution – an entirely new way of working. Often this does also require a new partner, but it doesn’t need to, especially if IT companies, and others offering outsourced services, are proactive and start offering new ideas and solutions to their clients.

At the end of the day the changing nature of the outsourcing relationship is really being driven by innovation. Companies today are finding that the competition they face next year may not even exist today. It is possible for entrepreneurs to have an idea and then to create a global service thanks to the scale offered by cloud and app infrastructure.

Services such as Transferwise, Spotify, or Uber would not be able to function without this kind of IT infrastructure – and the ability to scale up so rapidly is dramatically changing many industries forever. I think it will take a while for some executives to stop thinking about outsourcing as a cost reduction strategy – this is still how the business press largely talks about it. However, as the cloud and RPA become more common and more important across all industries, it should become clear to most management teams that their approach to outsourcing is now strategically important for the future of their business. Transformation is getting disruptive.


Visa Offers New Ways to Enhance CX with Sensory Branding

IBA Group
Vadim Smotryaev
Product Owner

When was the last time you used cash or, to be more precise, had to use cash? You probably had to think hard about a business in your daily routine that does not accept NFC payments or bankcards. Five years ago, paying for a coffee with a smart watch seemed like a futuristic fantasy – now it is a daily occurrence for millions of people.

The World Payments Report 2018 by Capgemini and BNP Paribas finds that non-cash transaction volumes continued to grow at double-digit growth rates during 2015–2016, reaching 482.6 billion. Non-cash transactions are estimated to accelerate at a compound annual growth rate (CAGR) of 12.7% globally with emerging markets growing at 21.6% from 2016 to 2021.
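Those growth rates compound quickly. As a rough illustration (my own arithmetic, assuming the 12.7% global CAGR applies to the 482.6 billion 2016 base for the full 2016–2021 period):

```python
# Projecting global non-cash transaction volume from the reported figures.
base_2016 = 482.6               # billions of transactions (World Payments Report 2018)
cagr = 0.127                    # 12.7% global compound annual growth rate
years = 5                       # 2016 -> 2021

projected_2021 = base_2016 * (1 + cagr) ** years
print(f"Projected 2021 volume: {projected_2021:.1f} billion transactions")
```

At that rate, global volume would reach roughly 877 billion transactions by 2021 – nearly double the 2016 figure in five years.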

Visa takes contactless payments to a new level, offering a sensory branding suite to merchants and partners in more than 25 countries. The suite includes sound, animation, and haptic brand cues that occur with a Visa payment transaction. All this helps signify to consumers that a payment has been made using Visa, whether in-store or online. The new sensory branding suite gives greater dimensionality to Visa’s brand and lets customers see, hear, or feel Visa when they pay.

Customer expectations evolve, and with IoT becoming more and more prominent in everyday life, Visa is expanding its sensory branding suite across the globe, including mobile-first and in-stadium experiences and prominent technology and retail partnerships.

One example of such a partnership is the Tap to Phone solution created by Visa and IBA Group. As an authorized Visa solution provider, IBA Group launched Tap to Phone solutions in Ukraine and Belarus. Designed for small and medium-sized enterprises (SMEs), the Tap to Phone technology is an evolution of mobile acquiring, enabling a smartphone with an NFC chip to work as a point of sale (POS). To accept payments, one only needs to install a transaction processing application on a smartphone and have a bank account.

The way we pay continues to change, and it is extremely important that customer experience goes hand in hand with the emerging technologies.

IBA Group presents Tap to Phone at Visa Cashless Summit 2019

Rethinking DevOps To Create A More Efficient Delivery Environment

IBA Group
Yuliya Varonina
DevOps Engineer

People often talk of apps as something new – an area of IT development that has only been around since smartphones became commonly used, but that’s not right. At IBA Group we have been developing apps for over 25 years. Before your iPhone, we were developing apps that could run on mainframe systems – we have over 80 teams and over 100 products in this area.

Every year we bring dozens of young people into our development team, often from those we see at the various hackathons we organize. There are still many new ideas for how to improve development inside mainframe culture.

It’s true that mainframes often look like legacy systems. I know that’s how I felt when I approached my first mainframe project three years ago. The infrastructure is quite complicated and the qualifications to work on these systems are quite specialized. It’s not an easy environment, but our teams are enthusiastic and they embrace new ideas.

Some of the key problems developing in the mainframe environment are:

  1. DevOps pain; a lot of manual operations for code building, customization, and setup for various environments.
  2. Development cycle; typically the cycle is between a week and a month – it’s not a rapid development environment.
  3. Version control; we have systems to help, but nothing is integrated with the modern version control systems most development environments use today.
  4. Limited automation.
  5. Poor visibility and control at all stages of development.

If you also work with mainframe development then you might know about these problems already. So what did we do in our own development environment to try addressing these problems?

  1. Automation; we started using the UrbanCode family of tools to start automating some of the infrastructure tasks.
  2. Integration; we integrated the UrbanCode processes with the Rational tools family – RTC (Rational Team Concert) and RQM (Rational Quality Manager).
  3. Reducing tools; we reduced the number of tools being used so we could focus on using the remaining ones more effectively.
  4. Scalable Pipeline; we built one project using the new DevOps methods and then assisted all teams to develop their projects this way, so these methods scaled across all development teams.
  5. Security; increased automation left gaps in security so we used DevSecOps to embed security functionality.
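The shape of the automated pipeline behind these steps can be sketched in a few lines. This is an illustrative sketch only – the stage names and helper functions below are invented for the example, not UrbanCode APIs – but it shows the core idea: stages run in a fixed order, and a failure stops anything from reaching the next environment.

```python
def run_pipeline(stages, context):
    """Run named stages in order; stop at the first failure."""
    completed = []
    for name, stage in stages:
        if not stage(context):
            return completed, name          # pipeline halted at this stage
        completed.append(name)
    return completed, None                  # all stages succeeded


# Invented stand-in stages for build, test, and deploy.
def build(ctx):
    ctx["artifact"] = ctx["project"] + "-1.0.0"
    return True

def run_tests(ctx):
    return "artifact" in ctx                # tests gate the deployment

def deploy(ctx):
    ctx["deployed"] = ctx["artifact"]
    return True

completed, failed_at = run_pipeline(
    [("build", build), ("test", run_tests), ("deploy", deploy)],
    {"project": "demo"},
)
print(completed, failed_at)
```

In a real mainframe pipeline each stage would call out to the actual build, test, and deployment tooling; the value is that the sequencing, gating, and reporting are no longer manual.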

I can talk in detail about how we did all this, and the benefits we found, but for the sake of this blog it’s better to just highlight the main benefits we found from this approach to DevOps.

  1. Faster deployment; it’s faster to develop new processes and systems, so your business operations can be more efficient.
  2. System thinking; building this DevOps environment creates a culture of system thinking, which means that responsibility, transparency, and feedback are all improved. Systems thinking creates a much more focused team that works together.
  3. Increased effectiveness; IT development is typically full of waste. People wait on others to deliver and cannot work until a specific part of a project is handed over. Managing pipelines makes deliveries more predictable and allows resources to be allocated more effectively.
  4. Better quality; we now have more tests, more automation, and User Acceptance Testing. We also trust the pipeline. People are used more effectively and this also increases the quality of deliveries.

We know from our own experience that our team now spends 20% less time on unplanned work and reworking problems. This has led to a 40x reduction in system failures, and the team is 50x more satisfied with their work. Even when a failure does occur, we can now recover 20x faster than before.

The list of benefits from this approach is endless. If this blog has sparked your interest in what is possible then please leave a comment here or get in touch with me. I can give you more detail and also personal experiences of going on this transformation journey.

Yuliya delivers her DevOps presentation at SHARE Pittsburgh 2019

A Renewed Focus On Cloud Security

IBA Group
Mark Hillary

Capital One bank in the US was recently targeted by a single hacker who managed to access the personal details of over 100 million customers, despite the bank having all the security you might expect of a large customer-focused organisation. The hacker was a former employee of Amazon Web Services (AWS), which hosted the bank database. They broke in by exploiting a poorly configured firewall, no doubt using some of their inside knowledge.

Once again we are watching as a major brand faces a data disaster. Capital One should be able to absorb the millions of dollars in fines and customer compensation, but for a smaller organisation this type of data breach could be the end. European fines are much higher than those in the US thanks to the European Union GDPR regulation, but why should companies be more focused on this question of data security now?

Because of cloud computing. A recent report in CIO suggested that 96% of companies are now using cloud computing, which means that almost every new database will be in the cloud. Justin Fier, the director of cyberintelligence at Darktrace, recently suggested that the general approach to securing networks – mainly with firewalls – has not yet woken up to the fact that everything is now in the cloud.

Network security managers have spent years designing their systems with the concept of what is inside the organisation, what is outside, and how to protect network entry points. Now we are seeing a complete shift away from this structure to the cloud. Companies such as Microsoft and Amazon are offering cloud services that allow their customers to access unlimited storage and computing power.

But this also means that your personal customer data will be outside the organisation and physically located on a service managed by another company. Companies like Amazon have developed a reputation for security and are probably better at securing their systems than any old internal system you previously had, but what happens when a current or former employee goes rogue and hacks into the database they used to manage?

As Justin Fier suggests, there are some new approaches to data management and network security that are essential:

  • Better network oversight; your development and support team can probably create and use new servers instantly, meaning the security team often has no real oversight of the network actually in use. Give them better tools that let them manage what is really out there on the network.
  • Look for malware; Capital One only found out about the hack three months after it happened, because stolen data was seen online. Be proactive and seek out malware and other tricks that hackers will use to break in.
  • Explore Artificial Intelligence (AI); you often can’t prevent an insider launching an attack so create some digital oversight. Use an AI system to monitor all network activity so you can be alerted when any unusual activity takes place – and ensure that nobody can turn off this AI police officer.
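The AI-oversight idea in the last point does not have to be exotic. At its simplest it means learning a baseline of normal activity and alerting on sharp deviations. A minimal sketch of that principle (illustrative only – the traffic figures are hypothetical, and this is not how any particular vendor implements it):

```python
import statistics

def learn_baseline(samples):
    """Learn the mean and standard deviation of historical activity."""
    return statistics.mean(samples), statistics.stdev(samples)

def is_anomalous(value, mean, stdev, threshold=3.0):
    """Flag any observation more than `threshold` deviations from normal."""
    return abs(value - mean) > threshold * stdev

# Hypothetical hourly outbound traffic in MB for one server.
history = [101, 96, 110, 104, 99, 107, 95, 103]
mean, stdev = learn_baseline(history)

print(is_anomalous(105, mean, stdev))    # an ordinary hour
print(is_anomalous(9800, mean, stdev))   # a sudden mass download
```

A real system would model many signals at once and adapt its baseline over time, but the principle is the same: the alert fires on behaviour, not on a known attack signature, which is exactly what catches a rogue insider.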

The bottom line is that cloud computing offers too many advantages and opportunities for companies to avoid it. With an adoption rate that is now almost universal there is no going back, but we certainly need to consider how best to change and adjust network security for the cloud computing era.

The border of the organisation is no longer the office itself. People and their skills are sourced from suppliers, and databases will be located in the cloud. Both people and data now move in and out of the central organisation in a porous way. Protecting this environment is the challenge we face today. Questions about a cloud security strategy should be amongst the first things any executive asks a potential IT partner, and if the supplier has no intelligent answers, then why would you work together?


Why Should I Be Thinking About DevOps?

IBA Group
Mark Hillary

What is DevOps? If you don’t directly work on the development of IT systems then this might be a strange concept, but for any executive who needs to purchase new technology systems, or modify existing ones, this is an important concept to be aware of.

In short, DevOps is a contraction of Development and Operations. It is a combined set of software development practices that bring together the development of software with IT operations. The aim is to create an environment where the systems development lifecycle can be shorter and more features and updates can be delivered – in general, it aims to bring IT development much closer to the business that is being served.

The broad goals span the entire lifecycle of a software project:

  • Increased frequency of project deliveries
  • Faster time to market on the original delivery
  • Lower failure rate of delivered software
  • Shorter lead time between business requests and fixes being applied
  • Faster recovery time when failure does occur

By using a DevOps approach the IT team can maximize the predictability of new releases. Efficiency and security are both increased and the code becomes easier to maintain.

IBA Group has conducted research into the effect of using a DevOps approach to software development – although this research was focused on mainframe DevOps. These are some of the key findings:

  • 20x faster recovery when software fails
  • 22% less time spent on rework to fix problems
  • 30x more deployments of new software
  • 40x lower failure rate of delivered software
  • 50x greater IT team satisfaction

All this data comes from real mainframe client projects at IBA Group. Deployment becomes more reliable and more frequent when people work together using this type of framework. The IT team uses a form of system thinking, which really means they create a culture of shared responsibility for the project. This culture encourages transparency – problems that one team member might have hidden in a regular organization are surfaced and managed together.

Automation is also an important part of the DevOps culture. The aim is to automate all the routine tasks a developer usually needs to manage. This also creates a far more satisfied developer who can focus on the more interesting and challenging parts of the project. This naturally leads to better quality and performance – enhancing the reputation of the team.

Most executives outside IT are not really familiar with software development practices, but it is becoming more important to understand them, because a different approach to the way software development is managed leads directly to business effects, such as better quality, fewer failures, and a team with higher job satisfaction. DevOps is well established as a practice, with a decade of conferences and articles exploring how it can be used effectively. If you need to purchase any form of software development from an IT company, then how they manage DevOps should be one of the first things you ask.


Outsourcing Has Evolved – Has Your Business?

IBA Group
Mark Hillary

The evolution of outsourcing is fascinating because it has evolved so quickly. Treated initially with some suspicion and largely considered to be a procurement exercise, outsourcing has matured into an important business strategy and most modern companies will work with suppliers to buy in their expertise. But have our attitudes changed?

To many managers, outsourcing is still a subject they avoid. Many have memories of failed projects, or of comparing suppliers and asking them all to compete on price. Some aspects of procurement don’t seem at all strategic, but outsourcing has matured, right?

According to PA Consulting data, it has. 64% of managers are using outsourcing as a way of driving business transformation – so the introduction of a partner allows them the opportunity to redefine how they do business. 35% of companies that are already outsourcing IT functions are planning to increase the work they give to their IT partner.

But the same study also finds that 69% of managers are using outsourcing to reduce costs. So is it still all about cost reduction or business transformation?

The reality is that working with a partner can now be both. Accenture has argued that we are moving to a business environment they call a “corporate marketplace” where many companies have relationships with many others – there is more of a value web rather than a value chain. This will also include on-demand work platforms, such as UpWork, where experts can be called on for very specific tasks for a short period of time.

It’s clear that outsourcing as a term is becoming dated. The corporate marketplace doesn’t sound much better in my view, but it is clear that companies will be employing a more fluid relationship with employees, individual subject matter experts, and suppliers with specific expertise in future.

Largely this is because of the service complexity. Look at customer service as a classic example of what has changed. A few years back a consultant would analyse your customer interactions and then lift and drop your entire contact centre from your business into a service company – possibly located thousands of kilometres away.

The ambition was largely to save on operating expense and to encourage suppliers to make capital expenditure (basic infrastructure required to deliver the service) in return for a long-term contract.

This approach seems quaint today. Look around at the fast-moving customer service environment and you will see that suppliers today need expertise in Robotic Process Automation, Artificial Intelligence, multiple service channels including social, emerging channels such as smart speakers, and an ability to analyse vast amounts of data in real-time.

None of this implies that a lift and drop solution to the other side of the world would be a satisfactory solution. The supplier today needs to offer deep expertise and an ability to help the client transform their business using the available technologies. They also need to be able to advise on the future – which trends might shape how the client does business next year?

Combine the supplier taking on this role as expert advisor with the more common use of individual subject matter experts and outsourcing looks completely different to how it was 10-15 years ago. It’s time for the business media and managers in general to change their views – for outsourcing in 2019 read partnership and transformation.


What Are The Top Cloud Management Platforms?

IBA Group
Mark Hillary

In my last couple of articles here I explored Cloud Management Platforms (CMP), both to define what they are and to give some ideas on how to choose the best one for a particular business. It was therefore interesting to see that a recent article in ITPro Today explores the top ten CMPs available today, assessing all their strengths and weaknesses.

All CMPs need to provide lifecycle management – this is the ability to track cloud resources over a period of time – and data protection, in addition to the main functionality of controlling and automating cloud-based processes. John Webster, a senior partner and analyst at the Evaluator Group, managed the research published in ITPro Today. One of his main concerns when comparing the different CMPs was that not all of them are keenly focused on data protection – they are focusing mainly on basic functionality.

John explained: “Data protection and disaster recovery is an IT responsibility, a bedrock function, and I think that the vendors in this space have to really start looking at that seriously.” He added: “Vendors will likely provide these capabilities through extensions to data protection and disaster recovery applications that are already available in the market.”

The top ten list of CMPs was created by weighing up several factors, including:

  • The quality of the user interface
  • Ability to manage various groups of users
  • Complexity of the service

Cost control is an important area of functionality that most customers want to use, but many CMP vendors find it very difficult to offer because they are constantly updating their products. The ability to create efficiency is much easier to plan when the software is stable. If the CMP is constantly being improved then there is an almost constant need to explore efficiencies.

John explained that support for Artificial Intelligence (AI) is likely to be an important capability in the near future. He said: “Support for cloud-native including Kubernetes, and application migration will be key functions in cloud management platform tools. AI assistance, or the assistance of artificial intelligence, will become more and more important as time goes on.”

Some of the CMPs on the market at present have been built from trusted and tested systems that were essentially managing IT estates – they have been converted to managing cloud-based systems, but others are built from scratch. It’s important to be aware of this when selecting a supplier. The start-ups might move faster and add more features all the time, but their platforms may be less stable and less tested in real-life situations.

The ten CMPs are not ranked with the best in position one; they are simply grouped as the ten best CMPs. This is because ultimately the right choice of CMP will be based on the different priorities and needs of each company. Follow the link to the article and you can read the top ten for free, provided you submit your contact details to the magazine.

Click here to see the list of the ten best CMPs according to the research by Evaluator Group.

Everest Reports On A Complex And Competitive RPA Market

IBA Group
Mark Hillary

The industry analyst firm Everest Group recently reported their latest research into the development of the Robotic Process Automation (RPA) market and they have some interesting findings. The market is getting larger and more competitive, but the services provided are also increasing in complexity too.

“The market for RPA technology is becoming more complex, with highly competitive and evolving product offerings,” said Sarah Burnett, executive vice president and distinguished analyst at Everest Group. “The Everest Group Products PEAK Matrix is an unbiased assessment that uncovers vendor and product differentiators to identify the leaders in RPA technology based on research that goes deep into the vendors’ performance and product details, features, functionalities and more.”

The PEAK Matrix reporting method summarises the 22 companies studied along two different axes. The first is vision and capability, describing the ability of the company to successfully deliver the products they promise. The second is market impact: how is their sales performance and impact on the wider RPA marketplace?

Everest Group classified the main RPA technology vendors into three categories of Leaders, Major Contenders, and Aspirants: 

  • Leaders: Automation Anywhere, Blue Prism, NICE and UiPath
  • Major Contenders: Another Monday, AntWorks, EdgeVerve, HelpSystems, Jacada, Jidoka, Kofax, Kryon, Pegasystems, Servicetrace, Softomotive, Thoughtonomy, and WorkFusion
  • Aspirants: AutomationEdge, Datamatics, Intellibot, Nintex, and Nividous

AntWorks, Automation Anywhere, Datamatics, Softomotive, and UiPath demonstrated the strongest year-over-year movement on both the market-impact and vision-and-capability dimensions and emerged as 2019 RPA Market Star Performers. WorkFusion scored as high as, or sometimes higher than, the leaders on the vision-and-capability dimension, but it was felt that they have not quite made the market impact to be categorised as a leader – yet.

Other interesting takeaways from the Everest research include:

  • Automation Anywhere, Blue Prism, and UiPath are the top vendors in terms of RPA license revenue, closely followed by NICE. Pegasystems leads in terms of revenue from attended RPA (RDA) licenses
  • Softomotive has the highest number of RPA clients in the market, most of which are small-sized enterprises and SMBs. Witnessing almost 300% year-over-year growth in its number of clients, UiPath holds the second spot in terms of number of RPA clients
  • Automation Anywhere leads in North America, which is the largest RPA market, and Latin America. Blue Prism leads in the UK and MEA markets, while UiPath leads in Continental Europe and Asia Pacific
  • UiPath holds the highest market share by license revenue across horizontal functions such as F&A, procurement, and HR, while Blue Prism leads in banking and insurance industry-specific process areas. Pegasystems has the highest market share in the contact center space
  • Leaders have moved away from perpetual licensing to annual/monthly subscription-based licensing models. Advances in RPA technologies and increasing client maturity are fuelling the rise of more output-oriented pricing models such as flexible usage-based (e.g., per minute/hour) and per-process or transaction-based models
  • RPA solutions continue to evolve with a host of capabilities, such as computer vision, workflow orchestration, intelligent workload balancing, auto-scalability, and predictive SLA monitoring, to help enterprises achieve strategic business outcomes
  • Attended RPA / RDA continues to witness increased demand in the market. AI-based next-best-action recommendation and interactive UI for on-screen agent guidance, which enhance worker productivity and help improve customer experience, are among the key RDA differentiators

This new research from Everest Group is comprehensive and insightful and was only just published in July 2019 so the information is current. I recommend browsing their findings because it is clear that RPA has moved beyond the hype cycle now and is becoming a serious strategy for many company executives.

Everest Reports On A Complex And Competitive RPA Market

How Do I Choose A Cloud Management Platform?

IBA Group
Mark Hillary

I introduced the concept of a Cloud Management Platform (CMP) in my last article here on the IBA Group blog and closed by saying that it’s a complex process to choose a specific platform. However, that’s exactly the decision that many managers are exploring right now and it can be even more complex when you need to buy a cloud, but it is your service provider that will actually be using it.

This white paper from the IT research firm Neovise gives some excellent advice on this particular issue. The white paper introduces the need for CMPs, as I did in my last article, but then it lists some specific questions you need to ask when determining the best system to use:

1. Business Requirements:

• What customers do you plan to serve, and what are their requirements? How well will the cloud platform serve them?

• How much additional work is required for installation/configuration? Integration? Adding missing features?

• How quickly will the cloud platform let you get to market and start generating revenue?

2. Product Requirements:

• Does the platform enable the right compute, network and storage capabilities?

• Are there specific hardware requirements for the platform? Or can you choose hardware from any vendor? Can you leverage existing hardware investments?

• How extensible is the product? Does it support federation with other providers?

• Does the platform allow you to seamlessly integrate new cloud services with your existing hosting services?

3. Support Requirements:

• Will it require significant resources and expertise to deploy, customize and operate the platform?

• What do you do if you need help deploying or troubleshooting the platform? Is there customer support? Or just community support? Both?

• Does the platform receive ongoing enhancements? Are new versions difficult or disruptive to install?

These are quite detailed questions, and there are different types of CMP as I outlined in my earlier article, but if you go into them with a clear outline of your specific capabilities, resources, timeline, and strategy then you can achieve a successful outcome. It’s recommended to include this information in any RFI or RFP process when selecting a supplier, so you can work with a supplier that supports your CMP preferences in addition to just agreeing on a cloud strategy.

If you have not already deployed a cloud or CMP, it is preferable to outline your preferences and needs right from the RFI – that way a potential partner can advise on the best cloud to use and the best tools to manage it.

In the early days of cloud adoption, this was all easier. A client would simply ask their supplier directly when more capacity or storage was needed. But as IT infrastructure has become more complex – usually a mix of cloud and on-premises equipment – it is essential to make the right choices about the system you use to manage your cloud, and to manage it in a way that works with the needs of your supplier.

Choosing a Cloud Management Platform is a complex process

IBA Group Celebrates Team Spirit at Tourist Rally

From June 21 to June 23 this year, IBA Group's 16th tourist rally took place 70 km from Minsk. Over 1,000 people got together to enjoy nature and sports. Participants ranged in age from a few months to their 70s, and the organizers did their best to meet the needs of such a diverse audience.

The tourist rally is an annual tradition for the IBA Group community – a time when employees, their families, and friends gather to celebrate team spirit, endurance, nature, and sports. It is more than an annual celebration; it is a spotlight on what these values mean to each IBA Group employee.

Expectations for the event were far from optimistic, as the weather forecast promised heavy rain and windstorms. That was the first challenge for most participants: they got through the downpour on the way to the venue and were rewarded with fantastic weather on the days that followed.

The IBA tourist rally has longstanding traditions that have developed together with IBA Group. What started as an 87-person event in 2005 grew into a massive rally of more than 1,000 participants. This year, the organizers welcomed a new team, Lemmingi, which challenged such veterans as Pertsy, Dynamit, Dobry Vecher, Belki, and Smart Cats. Bike biathlon was added to the list of competition categories. Other team categories included obstacle racing, mud racing, volleyball, badminton, rock climbing, and a draniki (potato pancake) contest, along with individual disciplines such as the ropes course and darts.

On Friday evening, to warm up and liven up the atmosphere, a culinary draniki competition was held under the guidance of an experienced chef. All participants and viewers/tasters enjoyed every minute and every bite of the contest. The Mammoth team took first place.

The opening ceremony of the tourist rally took place at 9:30 a.m. on Saturday, where Sergei Levteev, IBA Group Chairman, made a welcoming speech. Following tradition, Alexei Tereschuk, captain of Smart Cats, last year's winning team, opened the rally by raising the flag.

The rally program was diverse: every participant – athlete, fan, guest, or child – found something to their taste. The workshop program was equally varied. One could opt for an individual pottery class, take part in a tea ceremony, master a new style of calligraphy, make a nesting box, or compete in fun duo contests. The strongest could go up against Vyacheslav Khoroneko, a six-time Guinness World Record holder and repeated world champion and record holder in free weight lifting.

Other activities included a trolley ride, a walk on the rope course, wall climbing, and a ride in a BRDM-2 military reconnaissance vehicle. Everyone got a charge of positive emotions during the Fun Starts relay race, while wellness lovers could indulge in the saunas, bathe in a nearby lake, and get a Hawaii-like suntan.

Our top management at this year's tourist rally

On Saturday, all participants and guests enjoyed breakfast and lunch.

Throughout the day, the kids danced and played with animators, took rides on catamarans and canoes, played in an inflatable castle, rode a merry-go-round, and watched cartoons in the karaoke tent.

In the evening and throughout the night, the teams had wonderful bonfire parties accompanied by shashlik and guitar songs. Everyone could unwind at the disco and show off their vocal abilities in the karaoke tent. Saturday night also featured performances by Bez Bileta, a well-known Belarusian band, and the DeTroit cover band. It was one of many moments when members of competing teams celebrated the company's spirit and unity.

This feeling of being part of IBA Group's global family was complemented by the atmosphere of solidarity inside each team. Most teams comprised colleagues from the same department, and for them the rally is another chance to strengthen team spirit and extend relationships beyond the working environment.

Each team had a motto, a flag, and a designated territory. The team captains did their very best to win the contests and to feed their teams. Every member was assigned a task, and the teams' cooks used their creativity and experience to appeal to every team member's taste.

As tradition has it, there are no ex-employees at IBA. Many retired and former IBAers come to the rally to feel part of the IBA Group family; they are the tradition keepers of the rally. At the same time, we saw many new faces this year who have brought enthusiasm and change to the company's life. Tradition and innovation, youth and experience are what make us so similar and so unique.

The nine teams competed in multiple categories, including obstacle racing, mud racing, a water relay race, rowing slalom, volleyball, bike biathlon, rock climbing, and badminton. The ropes course, darts, and weight lifting were available as individual competitions.

The award ceremony took place on Sunday morning. The winning teams – Lemmingi, Dynamit, and Smart Cats – and the prizewinners in individual categories received medals and prizes. Gennady Makeev, HR Director, made a closing speech in which he summed up the results of the rally and thanked the organizers and participants.

You can get a glimpse inside the 16th IBA tourist rally by visiting our Facebook and Instagram.

IBA Group's tourist rally 2019

What is a Cloud Management Platform?

IBA Group
Mark Hillary

Most people working in IT today know about the cloud and how cloud-based systems can offer immediate access to storage or computing power. Many companies now use a cloud strategy to ensure they can ramp available systems or storage up and down – it's a common theme when planning an IT strategy.

But what is a Cloud Management Platform and in particular how can one be useful to service providers offering IT services to their clients?

Cloud Management Platform (CMP) is a term coined by the industry analyst firm Gartner, which wanted a way to describe products or tools that help companies optimize and manage their cloud infrastructure for cost, security, and operations. A good CMP strategy should allow users to maintain control over dynamic and scalable cloud environments.

So it is really just a control system that allows the user to maintain dynamic control over their cloud system. Instead of frantically calling a service provider and asking them to quickly scale up storage capacity in their contracted cloud, the customer should be able to use a CMP to just scale their cloud – it should be as easy as sliding a bar on a control panel.
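To make that "slide a bar" idea concrete, here is a minimal sketch of self-service scaling through a CMP. The `CloudManagementClient` class and its methods are invented for illustration; a real CMP would expose its own vendor-specific API and forward the request to the cloud provider.

```python
# Illustrative sketch only: CloudManagementClient is a hypothetical,
# in-memory stand-in for a CMP's self-service scaling endpoint.

class CloudManagementClient:
    """Toy model of a CMP scaling control."""

    def __init__(self, storage_tb: int, vcpus: int):
        self.storage_tb = storage_tb
        self.vcpus = vcpus

    def scale_storage(self, target_tb: int) -> int:
        # A real CMP would call the cloud provider's API here;
        # we only record the new target to show the idea.
        if target_tb <= 0:
            raise ValueError("storage must be positive")
        self.storage_tb = target_tb
        return self.storage_tb


client = CloudManagementClient(storage_tb=10, vcpus=16)
client.scale_storage(25)   # "slide the bar" from 10 TB to 25 TB
print(client.storage_tb)
```

The point is that the customer drives the change directly through the platform, instead of phoning the service provider and waiting.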

The major CMPs on the market today will all offer these different aspects of cloud management:

• Cost
• Security
• Performance

These are the three main areas in which the customer can manage their cloud, and they cover more specific tasks such as budgeting, rightsizing the cloud, compliance and monitoring, and creating alerts and other analytics.

CMPs are still quite new, so there are different types of tool available. Some are focused on specific issues, such as controlling security risk or optimising costs. Some can manage multiple clouds, so if you are using both Microsoft Azure and Amazon AWS you can manage both from a single tool. Some CMPs can even manage a cloud and your on-site systems simultaneously.
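The multi-cloud case can be pictured as one interface with an adapter per provider. This is only an illustrative sketch: the adapter classes below are stubs that return hard-coded values, where a real tool would wrap the Azure SDK and boto3 respectively.

```python
# Sketch of the "single tool for multiple clouds" idea: one interface,
# one adapter per provider. Adapters are stubs for illustration.
from abc import ABC, abstractmethod


class CloudAdapter(ABC):
    @abstractmethod
    def list_instances(self):
        ...


class AzureAdapter(CloudAdapter):
    def list_instances(self):
        return ["azure-vm-1"]      # stub: would query Azure here


class AwsAdapter(CloudAdapter):
    def list_instances(self):
        return ["aws-ec2-1"]       # stub: would query AWS here


class MultiCloudManager:
    """Fans a single request out to every registered cloud."""

    def __init__(self, adapters):
        self.adapters = adapters

    def all_instances(self):
        return [i for a in self.adapters for i in a.list_instances()]


manager = MultiCloudManager([AzureAdapter(), AwsAdapter()])
print(manager.all_instances())   # one view across both clouds
```

Adding a third provider then only means writing one more adapter, which is exactly the convenience these multi-cloud tools are selling.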

Clearly this is an emerging area. There are many new tools solving problems that have only recently become apparent. It was only a few years back that companies started taking cloud-based services seriously and it has become clear that it can be difficult to control the various aspects of a cloud-based system – such as cost control and security. In addition, if a service provider is delivering services on a cloud that is owned and managed by the client then the client needs an easy way to manage areas such as security in partnership with their service provider.

CMPs are still new and it is therefore difficult to advise on which one is perfect for each client, but there is a clear need to work on CMP selection with your service provider as any chosen solution must work for both client and service provider. I’ll explore this question next time here on the IBA Group blog.

You can also read about the IBA Cloud Platform

cloud management platform

How Is RPA Innovation Taking Shape?

IBA Group
Mark Hillary

I have written in the past about how impressed I was when I visited IBA Group in person. They don’t really shout about it, but their Robotic Process Automation (RPA) team has a level of expertise that I was not expecting to find when I visited Minsk last December. I keep recalling this when I see some of the analyst and media coverage of RPA online because there is still a strange mixture of anticipation and hype in most of the analysis.

HfS Research has been one of the main critics of the hype around RPA. They have consistently called on other analysts to provide realistic market projections and to stop repeating the ‘robots are taking over’ myths that have grown in frequency over the past two years. Their Horses for Sources blog in particular has been scathing when individual analysts have made RPA claims that simply cannot be supported by evidence or case studies.

HfS has documented its belief that the ‘big 3’ RPA companies – Automation Anywhere, Blue Prism, and UiPath – are creating a baseline for the entire industry. It’s messy out there because plenty of companies are trying to get a piece of the RPA hype, yet not everyone can succeed – not least because each system needs experienced people who can implement and use it. I believe WorkFusion should also be on this list, as they are not only helping to create that baseline but also disrupting the market by offering basic RPA services for free.

On March 3rd the Horses For Sources blog said this: “However, beyond scripts and bots and dreams of digital workers scaling up rapidly to provide reams of value, most enterprises are fast coming to the realization that they need an actual process automation platform capability that ingests their data, visualizes it, machine learns it, contextualizes it and finally automates it.”

The blog goes on to say: “The implication is that for many companies the dream is over. They thought that RPA would work easily and yet they have found that it’s actually quite complex to integrate into their main business processes. You cannot just point an RPA system at a business and say ‘automate that’ in the same way that computer software doesn’t write itself – someone needs to understand how to code so the computer understands what you need.”

Go and follow the link above if you want to read the conclusion to what they think will happen next, although the short story is that they believe that there may be a new phase of RPA led by AntWorks with a more integrated approach to automation. In a way, we are seeing the process of automation becoming more automated.

I think two conclusions can be drawn from what we are seeing in the RPA market at present. First, most implementations are starting to coalesce around the three top system suppliers, and that’s a good thing because the market cannot tolerate the fragmentation that dozens of small systems create. Second, the RPA story is not over yet. RPA remains quite difficult to implement, and anyone who makes the process easier could well lead the next chapter of the story.

Digital Twins Will Aid Digital Transformation In 2020

IBA Group
Mark Hillary

I wrote recently on this blog about the use of SAP to create Digital Twins, a digital representation of a real system so it can be more easily monitored and controlled. As I mentioned earlier, this has been a common practice in aviation for several years. Engine manufacturers will always maintain a digital version of every engine they sell and ensure that the digital version is updated in real-time using sensors on the real engine.

In aviation the advantage of doing this is obvious – it allows for more efficient maintenance and safety procedures when the digital engine allows engineers to monitor real engines remotely. But I believe that the launch of SAP’s Leonardo system last year will really start accelerating the use of Digital Twins as a common business strategy.

It is a combination of technologies and strategies that are creating this possibility, but the three important ones are:

1. The Internet of Things (IoT); as the real world fills with sensors and connectivity as standard for almost every electronic device we will reach a point where systems such as a Digital Twin are essential just to stay on top of what is connected and what information it is reporting. In the home, this may only be devices such as a Kindle, iPad, Echo, phone, lightbulbs, or heating thermostat, but in the industrial environment it can easily be more complex and difficult to control.

2. Artificial Intelligence (AI); with so much data being created constantly by sensors we will need to apply AI principles to the data just to make sense of it all. For example, if your home thermostat detects patterns in the way that you prefer your home to be heated then it should be able to anticipate what you want before you change the settings.

3. Machine Learning (ML); the ability to look at every action and outcome by every sensor inside a network will allow the system to learn about the ideal outcomes and then to suggest recommendations in future based on earlier learning.

It is really the IoT that is at the heart of this development. Imagine the complexity of a modern industrial facility – a large brewery or car factory for example. Across the entire property will be doors, windows, pumps, and various robots that all need to be coordinated. Most companies with these facilities will already have some sort of control mechanism, but the Digital Twin makes an assumption that every component (pump, door, assembly-line robot) has in-built sensors. By taking a feed from all of these sensors we can build a complete virtual mirror of the plant.

The IoT facilitates this by ensuring that the real-time sensor data is available, then the AI system goes to work on spotting potential problems or just process flows that are unusual and alerting workers to places they need to check.
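As a rough illustration of that loop, here is a toy digital twin that mirrors each physical sensor's latest reading and flags values that sit far from the sensor's running average, the kind of "unusual process flow" a worker would then check. The sensor name and tolerance are hypothetical.

```python
# Toy digital twin: mirrors the latest reading from each sensor and
# flags readings far from that sensor's running average.
# Sensor names and the tolerance value are hypothetical.

class DigitalTwin:
    def __init__(self, tolerance=0.5):
        self.state = {}        # sensor -> latest mirrored reading
        self.history = {}      # sensor -> all past readings
        self.tolerance = tolerance

    def update(self, sensor, value):
        """Mirror a reading; return True if it looks unusual."""
        past = self.history.setdefault(sensor, [])
        unusual = False
        if past:
            avg = sum(past) / len(past)
            unusual = abs(value - avg) > self.tolerance * abs(avg)
        past.append(value)
        self.state[sensor] = value
        return unusual


twin = DigitalTwin()
for rpm in (1000, 1010, 990):
    twin.update("pump-7", rpm)         # normal readings build a baseline
print(twin.update("pump-7", 2500))     # sudden jump -> flagged
```

A production system would of course use far richer models than a running average, but the structure – real sensors feeding a virtual mirror that raises alerts – is the same.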

I have seen this type of system deployed for an office management system where every light, heater, door, and window is modelled in the system. I believe that we will see the Digital Twin concept growing much faster as companies find that they can create enormous efficiencies by improving what they do and spotting problems before they happen.

As Forbes magazine recently suggested, it will soon be impossible to plan any kind of digital transformation for your business without first creating a digital twin. The processes will simply be too complex for any one manager to understand from start to end. Not only do you need to map out all the existing components, you also need to apply AI to oversee how the entire system is working.

Without these deep insights into how your business functions at present, any transformation plans will be impossible. Digital twins are not just for those obsessed with managing their existing IoT infrastructure; they are becoming an essential tool for managers who want to see how the future of their business might look.

Digital twins will aid digital transformation in 2020

How Is Artificial Intelligence Developing In The Enterprise?

IBA Group
Mark Hillary

Artificial Intelligence (AI) has moved from science fiction into the enterprise in recent years. Many companies are using AI systems today to intelligently analyse large volumes of changing data and to notice or predict patterns. Typical business uses today include examples such as:

  • Rail operators predicting train delays before they happen because the AI system can extrapolate from small delays to predict the impact on the entire network.
  • Customer service agents being advised on how to help customers by systems that know the answer to every question a customer has asked in the past.
  • Alexa answering your question because it immediately processes your voice and determines what you are asking before creating an answer.
  • Netflix knowing which movie you might want to watch because it knows your past behaviour and how similar customers have behaved.
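The rail example above can be sketched in a few lines: a small delay at one station is propagated along downstream connections to estimate the network-wide impact. The network layout and the damping factor are invented purely for illustration; a real system would learn these effects from historical data.

```python
# Toy delay-propagation model for the rail example. The network and
# the decay factor are illustrative assumptions, not real data.
from collections import deque

network = {          # station -> downstream stations
    "A": ["B", "C"],
    "B": ["D"],
    "C": ["D"],
    "D": [],
}


def predict_delays(start, delay, decay=0.5):
    """Breadth-first propagation: each hop inherits a damped delay."""
    predicted = {start: delay}
    queue = deque([start])
    while queue:
        station = queue.popleft()
        for nxt in network[station]:
            carried = predicted[station] * decay
            if carried > predicted.get(nxt, 0.0):
                predicted[nxt] = carried
                queue.append(nxt)
    return predicted


print(predict_delays("A", delay=10.0))
# a 10-minute delay at A ripples outward, shrinking at each hop
```

This is the general shape of such predictions: extrapolating from a small local observation to an estimate of the whole network's state.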

AI really is all around us today, in the enterprise and as consumers of services. In the present environment it would now be unusual for any company to not be exploring how AI can improve their business.

But AI does have one fundamental flaw: it is always limited to working on a very specific problem. You can have a very complex system that knows everything your customer may ask when they call for help, or a system that understands how to play chess or Go, but those individual tasks are all it can do. There is no inherent awareness of the environment around the system – although we use the term intelligence, it’s not really aware or sentient. An AI system that can play Go cannot plan the best route on a map.

This means that the system can only solve the problems it was designed for. Some might argue that this is a benefit, because it means that however good our AI systems get, they never move into the realm of awareness and all the problems that a conscious system might create.

A recent experiment by IBM has demonstrated that AI is nevertheless developing rapidly. They showed how an AI system could be asked a random question and then debate that subject. For example, in the video clip I watched, the system was asked whether pre-school facilities should be subsidised by the government; it responded by arguing for four minutes why such subsidies are useful.

This system has been pre-loaded with information on millions of subjects and objects. It’s stuffed full of encyclopedia content and research. But even with all this data it is quite an achievement to turn that into information and then a coherent argument.

Essentially, this system is starting to show that an Artificial General Intelligence might be possible. It would need to be pre-loaded with an enormous amount of general data and would then need a Machine Learning system to continue learning, but it looks more feasible than even a year or two ago, when Elon Musk started warning that we are heading for an ‘AI apocalypse’ because machines will eventually be more intelligent than humans.

I don’t think we will be seeing many business case studies featuring general intelligence just yet, but AI in the form we already know it will certainly be more important. AI is offering companies a chance to identify patterns and trends they could never see manually and this will be a strong source of competitive advantage in the next few years.

Artificial Intelligence (AI) has moved from science fiction into the enterprise in recent years

How Do You Understand What Customers Will Want In The Future?

IBA Group
Mark Hillary

It was great to read the article published on the IBA Group website about the CXOutsourcers Mindshare event in Windsor, UK. This was a very interesting event, hosted by Peter Ryan and Mark Angus, that brought together service providers from the BPO (Business Process Outsourcing) and CX (Customer Experience) industry.

The IBA team was at the event because their expertise in areas such as cloud computing and Robotic Process Automation is in high demand among CX companies – hopefully they managed to strike a few new partnerships!

As mentioned in the IBA article, I was speaking at the event about the future of the customer experience – how can you profile and understand the customer of the future?

What I tried to do with this talk was to initially frame expectations. It is easy for people to make wild predictions about how customers will behave in future, but what they often forget is that social and technological progress is not always gradual. Sometimes an invention or innovation can completely change the way that people behave.

A good way to think of this is by considering how railways changed society. Before railways people were forced to live within walking distance of their workplace. Railways created the freedom for people to travel to work and this in turn created the concept of the suburb.

We have seen a similar change in the past decade. Since the launch of the Apple iPhone in 2007 and the subsequent explosion in the use of social networks, the way that people communicate with each other has dramatically changed. This has led to a radical change in the way that people communicate with brands and an evolution in the way that the customer journey works – this is the journey from first hearing of a product to learning more and then eventually buying it.

That customer journey used to be quite simple and was focused on advertising or marketing to create awareness and then a sales process followed by customer support. Now we can see brands that are not building customer service contact centres, they are building customer experience hubs. They are using a mix of human and digital technologies and building an ongoing relationship with customers that can last for half a century or more.

What is so interesting about the present day business environment is that there is so much potential for dramatic change in so many ways. A retailer planning strategy in the era of my parents would only ever be planning new store locations and sales promotions – nothing in the future was dramatically different to the past.

Look at the retail environment today. Not only is online retail creating a new era of competition, but the way that town centres are featuring retail is changing. Other huge factors may also change how society interacts with business, such as climate change, geopolitics and the dramatic rise of China, the creation of social inequalities, and the preference to rent experiences rather than owning products.

You can click the link to read through my slides for some more of the ideas I presented at CXOutsourcers, but I think that what we will see more often today is emerging business models and services driven by the online economy and the desire of the customer for greater convenience. Go-Jek in Indonesia started out as a ride-hailing service with motorbikes – like Uber with two wheels. They expanded into offering services such as medicine or food delivery by leveraging their network of riders and eventually they created such a wide array of services that they introduced their own in-app payment system. They now process more payments than any major credit card brand… they are now a financial service brand and they started out offering rides on scooters.

How might this happen in your industry? Think about it: your competition in 2020 may not even exist today, or they may be working in a completely different industry. That’s scary, because it means we have to move faster than ever to stay ahead of business trends – and we are never going to move this slowly again.

Mark Hillary for CxOutsourcers
(c) Mark Hillary

SAP Is Redefining How ERP Can Create New Business Solutions

IBA Group
Mark Hillary

I visited IBA Group just a few months ago, and one of the most interesting interactions I had during my visit was with Dmitry Konevtsev, the SAP Department Director. In my own experience, attitudes to Enterprise Resource Planning (ERP) systems such as SAP have changed dramatically in the past few years. Systems that were once considered essential, and heavily invested in, have sometimes proven to be failures for the business. The reputation of ERP has been in decline, but what are real companies doing with ERP today?

I asked Dmitry for his views on how businesses are approaching ERP today. He said: “The implementation of ERP has been a matter of many discussions since the early days. People often feel uncomfortable about any changes to their business and ERP can often overhaul the operation of the entire enterprise. Therefore, employees are opposed to ERP, but the key is to approach it as a business task, not an IT implementation.”

Dmitry explained a recent project that IBA has been involved in implementing. He said: “We initiated the introduction of the newest SAP Profitability and Performance Management 3.0 technology at a major mobile operator. The customer aimed to develop a self-service analytics system that provides a comprehensive insight into the overall company performance. The project is one of the first SAP Performance Management for Financial Services (FS-PER) implementations in the world. The analytical solution is complicated, as it is designed to integrate and harmonize numerous heterogeneous data sources and involves millions of measurement units.”

He added: “The solution integrates numerous heterogeneous data from various sources (legacy systems) and is of high complexity, involving 40 million measurement units that result in billions of allocated data records. Our solution was selected as a winner of the European IT & Software Excellence Awards in the category Big Data, IoT or Analytics Solution of the Year in 2019.”

Dmitry explained how IBA has experience across many different industries, including oil and gas, railways, and telecommunications. ERP is an important tool across all these industries and although the media image of ERP installations has been largely negative in recent years, the reality is that many companies still require ERP to manage complex logistics and supply chains.

The idea of a digital twin is a concept that many industries are growing, especially as Internet of Things (IoT) sensors become more common. The digital twin has been common in the aerospace industry for many years – an engine manufacturer such as Rolls Royce will maintain a digital version of every engine they deliver and it is updated in real-time from the real engine using a stream of data from sensors, capturing a complete digital twin of the real engine.

Tools such as SAP can deliver this concept for other solutions. Dmitry showed me an example of a building management system. Now I am not an expert on managing buildings, but I can imagine the complexity of managing every single power socket, lightbulb, fire extinguisher, window, and door in a large building. It used to require pages of maps and floor layouts in addition to constant inspections. Now the entire building can be coded into a graphical interface so the user can see everything in 3D on screen and sensors all over the building feed data into the SAP model. A fire extinguisher can send an alert if it has been used and is now empty. A power socket causing a circuit to short out can be identified the moment there is a problem. The model I saw looked stunning because the graphics resembled a video game – maintaining a building like this would clearly be far more efficient than the old system of only fixing problems after they happen. With real-time monitoring and sensors, some problems can be predicted before they happen – and then prevented.
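The alerting part of that building model can be sketched as a simple rule table that turns sensor events into actions. The sensor types, statuses, and actions below are hypothetical examples for illustration; this is not the actual SAP implementation described above.

```python
# Hypothetical event-to-action rules for a building management sketch.
# None of these names come from the real SAP system described.

RULES = {
    ("extinguisher", "empty"): "Replace fire extinguisher",
    ("socket", "short_circuit"): "Isolate circuit and inspect socket",
    ("window", "open_after_hours"): "Send security to close window",
}


def handle_event(sensor_type, status, location):
    """Return an alert string if the event matches a rule, else None."""
    action = RULES.get((sensor_type, status))
    if action is None:
        return None                    # normal reading, nothing to do
    return f"{location}: {action}"


print(handle_event("extinguisher", "empty", "Floor 3, east wing"))
print(handle_event("window", "closed", "Lobby"))   # no rule -> None
```

The shift from periodic inspections to this event-driven style is exactly why real-time sensor feeds make maintenance so much more efficient.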

IBA is proud to be a long-term partner of SAP.

Dmitry explained: “Digital twins are a development of the IoT concept. Earlier, IoT was viewed as a technology of passive sensors that receive and send data. Today, it is a concept of smart sensors that analyze data and make decisions locally based on the data they process. In 2018, Gartner suggested that this is a technology stuck in the hype cycle, but we can see real clients asking for projects like this using SAP Leonardo.”

It is clear that ERP has evolved. The integration of IoT is allowing ERP to not only represent process flows, but also to predict and make changes. We are now in an age of intelligent ERP and the failed installations of the past can now stay in the past. Solutions like digital twins using SAP are demonstrating that there is a bright new future for companies ready to explore how ERP can help them to redefine and manage their business processes.

For more information on the SAP team at IBA Group click here.

How Technology Companies Are Delivering A Wave Of CX Innovation

IBA Group
Mark Hillary

It’s always great when the new issue of Intelligent Sourcing arrives. It’s updated all the time online, but there is still something nice about seeing a collection of news bound together in a real magazine. I know that’s old fashioned, but a quarterly business journal is like a collection of thoughts from that time.

This issue focuses on innovation, and I contributed a column that you can read via the link to the magazine below. Although my article focused on Customer Experience (CX) innovation, as I read it again I noticed that many of the specific innovations I documented require technology expertise.

This is quite a change from the days when customer experience was called customer service and involved nothing more than a contact centre full of phones. Handling interactions between customers and brands today is highly complex and operates across a number of channels. Here are some of the key areas shaping CX innovation today as outlined in the Intelligent Sourcing feature:

Customer expectations and journey; interacting with customers today takes place 24/7 across many different channels (including social) and involves thinking about a 50-year ongoing relationship with the customer, not just managing a single phone call.

Technology; almost every emerging technology you can think of is being applied to the customer relationship. To list just a few – Artificial Intelligence, Machine Learning, Augmented Reality, Virtual Reality, Robotics, location awareness, cloud computing, the app store. All these technologies are being shaped and influenced by the way that brands are using them to interact with real customers.

Automation; Robotic Process Automation (RPA) is being extensively deployed as a tool to make customer service agents more efficient. The blend between digital service and human service is now one of the most important areas of research in this field.

Customer-centricity; new companies can design their entire service around what customers want in an age of smartphones. There is no need to copy how a bank or insurance company operates – especially if they designed their processes many years ago. This is having an enormous impact on traditional brands that are being challenged by brand new companies that can deliver services better.

CX and Business Process Outsourcing are often presented as an entirely separate type of business, unconnected to what IT service companies are doing, but I believe that most of the innovation taking place in CX is driven by IT. In fact, many IT experts are now becoming experts in areas such as RPA, which means they are rapidly becoming CX experts too. The market for technology services is changing, and CX innovation is creating many of these new opportunities.

Spring issue of Intelligent Sourcing

CX and technology

Connecting RPA to Create An Automated Enterprise

IBA Group
Mark Hillary

How does Robotic Process Automation (RPA) really change the enterprise? Naturally, there is a need to seek out companies (like IBA Group!) with expertise in all the main software systems, because it pays to bring in experts when delivering a completely new system, but what about the wider changes that continue after the implementation?

I think this is an interesting question. RPA has the potential to fundamentally change the way that workflows are organised inside many companies, and yet I rarely see this discussed. Most RPA media coverage focuses on the size of the market and on how automation can replace employees.

Of course, the market size is important, and the potential for automation to replace people is also important – and scary. Many people are getting worried by media headlines suggesting that the robots are about to take their jobs.

Most of this reporting is irresponsible and doesn’t reflect how companies are really exploring the use of automated systems such as RPA. The current debate around RPA reminds me of the lump of labour fallacy – the belief that the amount of work in the economy is fixed, so restricting the hours employees can work will reduce unemployment. Work isn’t so simple, and the amount of labour is certainly not fixed.

Why is RPA becoming so popular? There are demonstrable benefits that can be attributed to RPA projects. Some of the clear business benefits that can deliver a Return on Investment (ROI) include:

  1. Faster time to market: products and services can be delivered faster when a part of the value chain has been automated, allowing quicker delivery and an improved time to market for new ideas.
  2. Productivity boost: more can be achieved with fewer resources, so the same team can deliver more than they did before automation.
  3. Reduced FTE requirements: if a significant part of your business processes can be automated, then logically the number of team members required to process this information can be reduced.

These are the initial short-term benefits. Naturally, the immediate benefits of an automation project will be that the processes work faster, allowing the same team to be more productive, but there are some additional longer-term benefits that should be considered beyond the initial boost.
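As a rough illustration of how those short-term benefits translate into a return, here is a back-of-the-envelope ROI sketch. The function and every figure in it are hypothetical, purely for illustration – this is not IBA Group's or any vendor's costing methodology.

```python
def rpa_roi(annual_process_cost, automation_share,
            licence_and_support, implementation_cost, years=3):
    """Rough multi-year return per unit invested in an RPA project.

    annual_process_cost: yearly cost of the manual process being automated
    automation_share:    fraction of that work the robots take over (0..1)
    licence_and_support: yearly RPA licence and maintenance cost
    implementation_cost: one-off delivery cost
    """
    yearly_saving = annual_process_cost * automation_share - licence_and_support
    total_saving = yearly_saving * years - implementation_cost
    total_invested = implementation_cost + licence_and_support * years
    return total_saving / total_invested  # return per unit invested

# Hypothetical example: a $500k/year process, 60% automated,
# $50k/year in licences, $150k one-off delivery cost, over 3 years
print(round(rpa_roi(500_000, 0.6, 50_000, 150_000), 2))
```

Even a crude model like this makes the point: the licence and delivery costs are quickly dwarfed by the recurring saving once a meaningful share of the process is automated.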

First is the ability to transform your business. Many industries are experiencing a wave of rapid change at present. Change really is the only constant for almost every traditional business model. Look at banks becoming apps, or news publishers searching for a revenue stream. Many traditional industries are finding that they need to change in order to survive in a very different business environment. If a significant part of your business can be automated then this facilitates innovation in the rest of your processes. It could even be argued that a significant digital transformation project will never succeed if you cannot automate the repetitive processes in your value chain.

Streamlining the processes you have yet to automate is another significant advantage – expanding the scope of automation beyond what you can initially achieve. Once you can see just how much of your business can be automated, there is a strong temptation to increase the processes your business manages using RPA. It’s important to create a period of stability once RPA is initially rolled out, but after that expansion should be encouraged.

Automating many of your systems allows governance checks to be applied automatically by the system and all processes and actions to be recorded. This can help with compliance and governance by removing the opportunity for manual errors and ensuring that a comprehensive audit trail exists for all automated actions.
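To make the audit-trail point concrete, here is a minimal Python sketch: a hypothetical `audited` decorator that records every automated action as a side effect of running it, so the trail cannot be skipped by manual error. All names here are illustrative and not taken from any particular RPA product.

```python
import functools

audit_log = []  # in a real deployment this would be an append-only store

def audited(action_name):
    """Record every invocation of an automated step, so a complete
    audit trail exists without relying on manual record-keeping."""
    def decorator(func):
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            result = func(*args, **kwargs)
            audit_log.append({"action": action_name,
                              "args": args,
                              "result": result})
            return result
        return wrapper
    return decorator

@audited("approve_invoice")
def approve_invoice(invoice_id, amount):
    # hypothetical business rule: auto-approve small invoices only
    return amount < 1000

approve_invoice("INV-42", 250)
print(audit_log[0]["action"])
```

Because the logging lives in the automation layer rather than in each business function, compliance checks can be applied uniformly to everything that runs through it.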

It won’t be easy to see all the potential benefits from RPA immediately because there are just so many software vendors and no single control mechanism. We are still watching the growth of a market that will transform how companies operate far more than ERP or CRM ever did. But questions remain about how to link RPA into other systems within the enterprise – the Internet of Things for example. Instead of building these links differently for every organisation why are we not building RPA systems like USB cables – able to just plug in anywhere?

RPA will fundamentally change how enterprises are designed in the next decade, but some important decisions will need to be taken along the way. Not every RPA vendor will survive, just as Betamax video was killed off by the market acceptance of VHS. It’s going to get interesting out there as more companies rely on automation to compete.

What Can We Really Expect From RPA in 2019?

IBA Group
Mark Hillary

I wrote recently on this blog about my surprise at how sophisticated the RPA solutions offered by IBA Group already are. As I mentioned, they are offering solutions that are far from the typical vapourware offered by some IT specialists – they already have genuine case studies using all the major RPA platforms.

So to take this theme forward a little, how is the RPA market doing? You will have seen many bullish predictions, but a couple of reports from Forrester Research and RPA software specialist UiPath caught my eye.

Forrester makes some bold claims. Principal Analyst JP Gownder said: “Automation will be central to the next phase of digital transformation, driving new levels of customer value such as faster delivery of products, higher quality and dependability, deeper personalization, and greater convenience.”

That’s strong support for RPA in 2019, but Forrester also notes that we are now reaching a tipping point for automation. The Forrester argument suggests that we will now start expecting professional employees to be augmented by automation. This alters the workforce and drives companies to focus on customer value.

In fact, this is a common theme in talks that I have given on automation. I believe we will see this being a much more pervasive change in the way that professionals work and are hired for their jobs. At present we still see RPA as a function of the technology department, but soon we will see job adverts on LinkedIn for HR professionals, credit analysts, and accountants all asking that the applicants have relevant RPA coding experience – that’s going to be quite a change.

In their predictions for 2019, RPA software supplier UiPath made a few more interesting predictions:

- Government adoption will soar; governments always need to do more with less, so they will be quick adopters of RPA.
- Less focus on headcount reduction; the focus for RPA will shift from saving cash to improving employee engagement and how employees work.
- Death of BPO; controversially, they also predict the end of Business Process Outsourcing, because RPA tools allow internal teams to create their own efficiencies.
- RPA blended with AI; put them together and you can create a wave of new intelligence – they each help each other.
- Growth will be bigger than you expect; despite many analysts already predicting strong growth for RPA, the team at UiPath says 2019 will exceed even those forecasts.

I think that both UiPath and Forrester have some interesting insights here and I tend to agree with them both. I believe we really are at a tipping point and the effect will go far beyond the technology team. The adoption of RPA will affect just about every professional employee and will demand that they start adopting new skills and methods.

This is why UiPath can speak with such confidence. If the analysts are still focused on the ability of RPA to redefine processes that are defined by the CIO then of course there will be growth, but if we start to see every single business process being redefined and automated then the current growth projections will be nowhere near large enough. Let’s see what 2019 has in store for us all!

How Will Platforms Develop in 2019 – Especially Cloud?

IBA Group
Mark Hillary

One of the biggest changes in recent years in the IT market has been the use of platforms to deliver solutions. Software development may have moved on from the old waterfall model to agile delivery, but many projects still required analysts to design solutions, developers to build them, and engineers to deliver the product. Platforms have changed that.

Consider just these three types of platform and the change becomes obvious:

1. Cloud; the ability to use a system remotely without local hardware or storage requirements and usually charged on a pay-as-you-use basis.

2. App Store; developing systems for Android or iOS and releasing them to the App Store allows software to be globally distributed instantly.

3. Social and mobile; the success of games such as Farmville or Mafia Wars has largely been because they were designed to be social – play is integrated into social networks such as Facebook so friends can actively engage with your game or compete with you.

These are all major changes to the way that software is designed and released. Not least among them: business line heads – rather than the CIO – will increasingly make decisions on new IT systems. If the IT tools can be delivered without affecting the infrastructure of the client company, then this is much more likely.

In particular, the cloud is changing how enterprise systems are delivered. Business users can pay for systems on a subscription basis without the need to plan for infrastructure and this is radically transforming how many companies see their use of IT and supportive software systems, such as ERP or CRM.

But even the cloud is undergoing rapid change. Many developers are now suggesting that there are some key trends to look for in the cloud market in 2019, such as:

1. Multicloud; AWS, Google, and Microsoft are the cloud giants and all have their strengths, but many companies are now exploring how to reduce their reliance on a single cloud supplier – for security, resilience, and to gain advantages from the strengths of each one.

2. Migration; it will be more common to migrate across different clouds to take advantage of deals or strengths from specific suppliers. New tools will make this much easier.

3. Governance; many companies still have fairly weak rules governing their cloud use and this will get stronger in 2019.

These are interesting points and worth noting. It has never been a good strategy to get locked into a single IT supplier, and now we are seeing the same caution with cloud suppliers. 2019 will not only be a year of opportunity for various platforms, it will be a year when we start auditing how these platforms are being used.
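The caution about lock-in can be sketched in code: if business logic talks to a provider-neutral interface, moving between AWS, Google, and Microsoft becomes an adapter swap rather than a rewrite. The classes below are purely illustrative – a real adapter would wrap each vendor's SDK behind the same small interface.

```python
from abc import ABC, abstractmethod

class ObjectStore(ABC):
    """A thin, provider-neutral storage interface. Code written against
    it can be migrated between clouds without touching business logic."""
    @abstractmethod
    def put(self, key, data): ...
    @abstractmethod
    def get(self, key): ...

class InMemoryStore(ObjectStore):
    # stand-in for a hypothetical AWS/Google/Microsoft adapter; each
    # real adapter would hide that vendor's SDK behind these two methods
    def __init__(self):
        self._objects = {}
    def put(self, key, data):
        self._objects[key] = data
    def get(self, key):
        return self._objects[key]

def archive_report(store: ObjectStore, name, body):
    # business logic sees only the neutral interface; no vendor types leak in
    store.put(f"reports/{name}", body)

store = InMemoryStore()
archive_report(store, "q1.txt", b"totals...")
print(store.get("reports/q1.txt"))
```

Multicloud migration then reduces to instantiating a different adapter, which is exactly the flexibility the trend-watchers above are predicting.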

IBA Hosts Rocket.Build Local Hackathon

From January 24 to January 25, IBA Group hosted the Rocket.Build Local 2019 – Minsk, Belarus hackathon. For the first time, the annual event took place at the IBA Group High-Tech Park campus. Rocket.Build is Rocket Software’s annual hackathon, bringing together engineers and programmers from around the world to work in teams on new products that help Rocket customers solve their business and technology challenges.

IBA has been cooperating with Rocket Software since 2016, the primary area of cooperation being mainframe products. Rocket develops products in diverse fields, including analytics, networks, data, storage, and enterprise software. The firm’s products are designed to run on mainframes, Linux/Unix/Windows, IBM i, cloud, and hybrid/virtualized systems.

IBA Group specialists who are involved in Rocket Software projects took part in the hackathon. All 90 participants were divided into teams of two to four members. The teams came up with 26 ideas for solving production issues.

After finishing work on their projects, the participants presented their ideas to the customer. In a strictly limited three-minute presentation, each team had to convey the value of the proposed solution to the audience and the customer, and to demonstrate its functionality.

Anjali Arora, Chief Product Officer at Rocket Software, chose the winner of the main prize, the CPO Award. The winning team is expected to travel to Boston to participate in Rocket.Build Global, to be held from June 9 to June 13 at Rocket Global Headquarters. In addition, a peer vote was held at Rocket.Build Local 2019 and three teams were selected as winners.

The hackathon was a delightful event for IBA Group employees, giving them an opportunity to demonstrate their skills in a friendly and innovative atmosphere.

IBA Group hosts Rocket.Build Local hackathon

Using The Cloud To Achieve Digital Transformation

IBA Group
Mark Hillary

Cloud computing has transformed how the enterprise uses both storage and applications. Local storage is no longer a requirement, with remote server farms absorbing demand on the fly and applications drawing on cloud resources to cope with heavy loads – such as online retailers on particularly busy shopping days. All these benefits of the cloud have become normalised, but could we be achieving more with cloud-based services?

A recent feature in ZDNet suggests that we can: digital transformation. Most enterprises view digital transformation through the lens of a change in business model, for example a retailer focused on in-store sales shifting its attention to apps. However, the ZDNet research shows that enterprises that need to change business model quickly, or offer an improved digital service, could be exploring cloud-based services more effectively.

The research group IDC claims that spending on digital transformation in 2017 topped $1.3 trillion, so this is a market where a lot of enterprises are spending, but are they getting the best value and results from that investment?

The problem in many cases is that the technology makes the problem look simple. The cloud, Big Data, the Internet of Things (IoT) – on the surface all these technologies are easy to understand. Executives can sit in the boardroom and proclaim that we need a business strategy that builds on the IoT, yet it is one thing to declare that you want to use a certain type of technology and another to figure out how to make that an integral part of your business in a way that delights your customers.

This is where ZDNet believes that a focus on the cloud can pay dividends. Too many enterprises do not have an efficient or well-structured IT department. There are disparate apps spread across different storage facilities and no easy way to share and analyse data across all the business applications being used. Rather than focusing on technology-led solutions, such as what can we do with AI or the IoT, it makes more sense to fix this central core of the organisation by using cloud-based services. Fix the platform and your applications and data use will naturally transform.

Secure cloud access is a proven technology and it is simple to deploy if you hire a good partner with cloud expertise. The initial focus can be on storage, but you can build out from there to centrally manage business applications and data.

This approach can drive digital transformation in your business, not because it is a radical change to manage storage or apps in this way, but because once you organise your systems around this central core, with strong business application support and the ability to share data across systems, you will be able to quickly design new solutions that could only ever work in this environment.

Sometimes it is better to build the platform and let the solutions arrive through innovation, rather than trying to build the solution on day one. Once you have a platform in place that allows data sharing, then the ideas will flow and your teams will automatically start exploring digital transformation options that are driven by business needs – not just technology.

IBA Group has been exploiting the capabilities of cloud computing, as well as other CAMSS technologies (cloud, analytics, enterprise mobility, social media, and security), to provide end-to-end business transformation services to our customers. In 2015, IBA Group opened its data centre, which offers various types of cloud services, including IaaS (Infrastructure as a Service), PaaS (Platform as a Service), and SaaS (Software as a Service).

How Tech Plays An Important Role In Delivering Great CX

IBA Group
Mark Hillary

I recently visited IBA Group in Minsk and I had the pleasure to speak with Andrei Lepeyev, the director of software development at IBA Group. As someone who studied software engineering at university myself, I’m always fascinated by the way that platforms such as cloud computing and the app store have changed what it means to deliver software, so it was really great to catch up with Andrei.

You can hear our conversation on the CX Files podcast by clicking here or by searching your favourite audio podcast provider, such as iTunes, Spotify, SoundCloud, or Stitcher. Because we were focused on CX, we talked about some of the technologies and systems Andrei is working on that have a direct impact on the quality of CX for the clients of IBA Group.

I had initially asked Andrei about how Artificial Intelligence (AI) is being used to predict customer behaviour, but he explained to me that IBA has gone further and created a product called APPULSE that offers a complete Level 1 and 2 support service for mainframe computer systems.

Andrei said: “APPULSE not only detects the system and finds problems, it uses Machine Learning to learn about the solutions so in many cases it can create a self-healing mainframe system. Mainframes are still really important and unbeaten in the range of directions they are deployed. They are the most stable and virus-free systems, but their user-interface is not usually so good.”

Andrei was talking about the importance of keeping mainframes running because they are often overlooked by most customers, yet your bank relies on those systems if it wants to offer a 24/7 online banking platform. Ensuring that the system can heal itself before problems even happen is an enormous improvement on the way a traditional IT support operation runs – fixing problems only after they have occurred, which is always a disaster for customers who need service now.
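APPULSE's internals are not described here, so purely as an illustration of the monitor-detect-remediate pattern Andrei describes, a toy self-healing loop might look like the sketch below. The threshold rule and remediation names are hypothetical; a real system would learn both from operational history, as Andrei says APPULSE does with Machine Learning.

```python
def self_healing_check(metrics, history, remediations):
    """Toy monitor-detect-remediate pass over current system metrics.

    metrics:      current readings, e.g. {"queue_depth": 120}
    history:      per-metric list of past readings used as a baseline
    remediations: metric name -> corrective action to run when anomalous
    """
    actions_taken = []
    for name, value in metrics.items():
        past = history.get(name, [])
        if not past:
            continue  # no baseline yet, nothing to compare against
        baseline = sum(past) / len(past)
        if value > 3 * baseline:  # crude anomaly rule; real systems learn this
            fix = remediations.get(name)
            if fix:
                actions_taken.append(fix())
    return actions_taken

history = {"queue_depth": [10, 12, 9, 11]}
fixes = {"queue_depth": lambda: "restarted stalled consumer"}
print(self_healing_check({"queue_depth": 120}, history, fixes))
```

The key idea is that detection and remediation run continuously in the same loop, so the corrective action fires before a human support queue ever sees the incident.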

One of the big trends for 2019 in CX will be Robotic Process Automation (RPA). Andrei explained to me that IBA Group has delivered implementations all over the world using the top four RPA platforms so they are not just riding a wave of RPA hype, they have real customers and case studies from numerous countries. But I asked Andrei how they choose the best platform for different customers – is the software really very different?

He said: “First we think about the support level of each supplier. Can they provide education or trial systems? Can they add specific requests to the software? Can they give extra information to a company like ours that may be implementing the solution?” However, Andrei also added an interesting point that is not often discussed in the industry – sometimes it is simply which software system the customer sees first. He said: “It is also important to see how each of the companies is marketing their product to the client. Often we will be approached by a potential client who already has a pilot system – developed free by the software vendor – and it can be very hard to move them to another system even if we think it could be better.”

Andrei mentioned Machine Learning when describing the mainframe support system and I asked him about the popularity of this in 2019. Are more and more customers asking how to make their systems learn about customers and systems automatically?

Andrei said: “Yes, many more clients are asking about it. The main reason is that there has been an evolution of hardware. A simple mobile phone allows almost every standard machine learning platform to work. Ten years ago this was impossible. We are not talking about huge brands like Google and Amazon – even smaller companies can deploy a machine learning system today – there is a very low barrier to entry now.”

When I asked Andrei about his priorities for 2019 he said that he wanted everyone in the industry to remember that none of these technologies exist in a vacuum – they all need to interact with other technologies and business processes. He said: “When we are talking about AI we cannot talk about it alone, it should be the business application of AI. We can’t talk about RPA without Machine Learning. We can’t talk about Cloud Computing without talking about the solutions that are built and deployed on the cloud. I’m looking forward to some projects in 2019 that involve AI using RPA and are delivered on the cloud.”

My conversation with Andrei provided a great insight into how some of these technologies are really affecting the customer experience. A large amount of media coverage is just hype, but as Andrei demonstrated, there is a great deal of substance here. These technologies can deliver game-changing systems, but the companies using them to interact with their customers need to have great products and services – it is not the use of an RPA or AI platform alone that will help them to be more successful.

To listen to the CX Files podcast featuring Andrei Lepeyev from IBA Group, click here.

RPA Has Truly Arrived

IBA Group
Mark Hillary

When I recently visited IBA Group in Minsk I was expecting to hear about their Cloud Computing solutions and some of their more recent developments in Machine Learning, but I was surprised to hear exactly how developed their Robotic Process Automation (RPA) expertise is today. I was surprised because their approach to RPA is not typical. They have experience of delivering real projects to real clients in multiple countries using the top four RPA software platforms.

Now contrast this to the typical RPA story in the media. Robo-bosses, robots taking over, and other mentions of robots replacing humans. When reading about RPA we usually read hype and grand claims of digital transformation, often from experts or IT companies with very little track record in this area. Yet IBA has been quietly developing expertise in all the major RPA platforms all over the world and there is no hype at all. They have just been getting on with the job.

RPA is now (almost) a $2bn a year marketplace and it’s growing fast. This area of business is only going to get more important as we move into 2019. HFS predicts that we are looking at $2.3bn revenue in RPA technologies in 2019 and this will grow to $4.3bn by 2022.

Traditionally HFS has been the least bullish of the analyst community. For a long time they criticised analysts such as Gartner for hyping the RPA market, but now even the HFS predictions look exciting. Gartner predicts that spending on RPA software is growing around 57% year on year, which is a phenomenal increase for any market, but what is really interesting is how all the analysts seem to be agreeing that RPA is no longer in the Hype Cycle and is now being accepted as a regular business process automation tool. Even the forward projections of Gartner to 2022 feature year-on-year growth of 41% – RPA has arrived.

When I arrived at IBA, I never expected to hear such a solid RPA success story – case study after case study of real RPA deliveries. I did a detailed interview with Vjacheslav Mikitjuk, director of Internet Technologies, that I intend to publish in the new year.

The RPA world is full of hype. HFS Research has been a vocal critic of the RPA hot air and fake news for the past few years, but even they now acknowledge that there are real solutions being delivered that are adding value all over the world. I witnessed this up close when I went to visit IBA Group and it was not even something that I had expected. They have kept their RPA expertise fairly quiet, but I’m hoping to change that in 2019 by telling the world what they have been doing.

To listen to the CX Files podcast featuring Andrei Lepeyev from IBA Group, click here.

Mark's visit

What Does Your Digital Core Look Like?

IBA Group
Mark Hillary

The life of the CIO has been like a roller coaster in recent years. The strategic importance of information was probably only really appreciated in the 90s, when the CIO became a more common term than IT Director. It was then that the value of the information, and what a company did with it, became more important than the technology itself.

But in recent years, the CIO has seen cloud-based systems take over. So long as business teams had access to the Internet, they could subscribe to pay-as-you-go business services offering everything from CRM to ERP to data storage – software as a service. It seemed that the IT department was offering little of strategic significance for many companies, other than ensuring the business teams could access their Internet-based services.

Fast forward to the present, and some organisations are thinking again about their IT infrastructure, because the strong core approach is becoming a popular way to organise technology inside the enterprise. But what is the core, beyond just offering a secure network?

The core approach offers security, but also APIs into all business applications that the company uses, a single way to share data across applications, and a stable environment where automation/bots and tools such as Robotic Process Automation (RPA) can be applied. In short, the enterprise creates a core for data and applications and benefits from being able to share data across teams. This also offers the opportunity to automate processes and create efficiencies that are impossible if each individual department is just deploying cloud-based business solutions.
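A minimal sketch of that idea, assuming nothing about any particular product: a core that exposes registered applications through one gateway and runs a shared event bus, so analytics teams and automation bots alike can subscribe to the same data. All class and topic names below are hypothetical.

```python
class DigitalCore:
    """Toy 'strong core': applications register behind one gateway and
    publish events to a shared bus that any team can subscribe to."""
    def __init__(self):
        self.apps = {}
        self.subscribers = {}

    def register_app(self, name, api):
        # single place where every business application's API is exposed
        self.apps[name] = api

    def subscribe(self, topic, handler):
        self.subscribers.setdefault(topic, []).append(handler)

    def publish(self, topic, event):
        # data is shared across teams; this is also the natural hook
        # where automation/RPA bots can attach to live business events
        for handler in self.subscribers.get(topic, []):
            handler(event)

core = DigitalCore()
seen = []
core.subscribe("orders", seen.append)      # e.g. an analytics consumer
core.subscribe("orders", lambda e: None)   # e.g. an RPA bot
core.publish("orders", {"id": 1, "total": 99})
print(seen)
```

The efficiencies described above fall out of this shape: because every department's events flow through one bus, cross-team insight and automation need no per-department integration work.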

The digital core should in fact be a core for driving improved customer engagement with the business. By creating opportunities to manage enterprise data more effectively, insights can be created and customers can experience a far more personal service – the enterprise finds efficiency, but the customer experience is also improved.

Building a core requires a consistent approach to building a central platform, sharing APIs, applications that can work together, and data that can be shared and analysed. It requires an enterprise-wide approach to managing data and applications, which sounds a bit like the old days of central control from the CIO office, but the insights and efficiencies that can be achieved from this approach should outweigh any loss of autonomy for individual business units. In fact, individual business managers can continue to select and deploy their own software solutions so long as they can be plugged into the core system. Flexibility should still be promoted.

We are moving back to an environment where the CIO matters once again. Have you explored the core in your own enterprise yet?