Big Data is growing up – finally. That’s the conclusion of new research published recently by the Economist Intelligence Unit (EIU). The research details how corporate attitudes to data have changed in the past four years – with many organisations now seeing data itself as a corporate asset.
Instead of constantly seeking more data, companies are asking the right questions. They are seeking the right data to support decision making, rather than measuring and capturing everything regardless of use.
This strategic alignment with a more intelligent approach to data often comes with the elevation of a data manager to the executive board. Either the role sits with the CIO/CTO or a new Chief Data Officer role is created to ensure that there is always a view on data value at the top table.
What is particularly interesting for managers who are asked to invest in Big Data projects is the link between a well-defined data policy and financial success. A well-defined data policy correlates not only with business success, but also with the ability to resolve real business problems through more effective data use.
Commenting on the EIU research in Forbes magazine, Bernard Marr, author of the book “Big Data”, said:
“As technology continues to improve, the ‘bigness’ of big data will become less and less of a factor. Companies are becoming more comfortable with the idea that they will need to scale up to allow the value of data initiatives to reach all sectors of the business, and so they are becoming more comfortable with approximation, agility and experimentation.”
I agree with Marr. We can see from this EIU research that more companies are exploring how to use Big Data, and importantly, more are finding a genuine business reason or use. As more companies find these reasons to get engaged, the use of Big Data will explode in size – all over the world.
On November 15, the IBA Group team won first place at the 12th IT Spartakiada sports competition. The multi-day event included matches in basketball, volleyball, karting, bowling, table tennis, and kicker (table football).
Nineteen teams from various IT companies participated in IT Spartakiada, with the first competitions starting in September. IBA Group won gold in karting and bowling, as well as bronze in kicker and volleyball. The table tennis and basketball teams finished fourth and fifth respectively.
Winners of IT Spartakiada are determined by the lowest sum of points scored by a team in its top five sports.
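As a hypothetical illustration of this rule (the team names and placings below are invented; only the rule itself – the lowest sum of placings across a team's five best sports wins – comes from the article), the ranking can be sketched like this:

```python
def spartakiada_rank(placings_by_team):
    """Order teams by the sum of their five best (lowest) placings."""
    totals = {
        team: sum(sorted(placings)[:5])  # five best results per team
        for team, placings in placings_by_team.items()
    }
    return sorted(totals, key=totals.get)  # lowest total wins

# Invented example data – two teams, six sports each:
teams = {
    "Team A": [1, 1, 3, 4, 5, 9],  # best five placings sum to 14
    "Team B": [2, 2, 2, 6, 7, 8],  # best five placings sum to 19
}
print(spartakiada_rank(teams))  # → ['Team A', 'Team B']
```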
Later, on November 28, Miss IT 2015 was held as part of IT Spartakiada. Evgenia Sudakova, a software engineer from IBA Gomel, represented IBA Group. Her choreographic piece impressed the audience and jury with its grace and artistry.
The jury praised the originality of Evgenia’s image, her erudition, and her ability to present herself, naming her the winner of Miss IT 2015 and awarding the young woman a well-deserved crown. It was the first crown for a contestant from Gomel.
We congratulate the IBA Group team and Evgenia on their titles and wish them more success in the future.
This year, IBA CZ successfully delivered two projects that mark another step forward in the area of portals. The projects were implemented for the government sector and included solutions for managing objects of cultural heritage.
Although library portals are a new skill for us, we are not first-timers in the area of digital cultural heritage. Over many years, we have gained profound experience working with digital cultural heritage records, for example, during the project implemented for the Police of the Czech Republic. That project was a portal for an artwork registry system that processed information about stolen and recovered objects of cultural value. So we had a good knowledge of cultural valuables and a deep knowledge of police systems. The extension to another specific area was just the next logical step.
Currently, only a few companies work in the area of library systems. The information systems they have been deploying for quite a long time in institutions, libraries, and museums are rather static and are viewed as legacy systems today. Given our deep knowledge of portals and previous experience with similar systems, we brought some fresh air into the world of library information systems. And we succeeded.
While proving our experience in the sector, it was important to understand what library systems really are. Therefore, in partnership with Masaryk University in Brno, we worked with external experts to better understand the specific requirements and characteristics of these systems.
In fact, every institution that owns a collection of books, museum showpieces, or any other collection tries to catalog it. But each one does it in its own way. Figuratively speaking, at first there was a clay tablet, then came papyrus, followed by parchment, paper, and finally a digitized information system.
Needless to say, the uniformity of data is at a very low level. Special-purpose protocols and standards were supposed to improve the situation, but they are past their prime.
For now, nothing better is available (please forget about web services). In addition, the institutions want to exchange information about their collections.
Once we understood and had learnt by experience what library systems were about, all that remained was to master the existing implementations of these non-traditional technologies and integrate the whole thing with the portal. It was not easy, but thanks to the dedication of the whole team we brought it to a successful end.
The web portal of historical collections currently runs on the Liferay platform, providing search across 50 castle libraries and ten other large library institutions. To see it, visit https://histfondy.npu.cz/.
The digitized cultural heritage portal displays the collections of books and items from the Vysocina region. To see it, visit http://digitalizace.kr-vysocina.cz/. This project is called PSEUD and is based on IBM WebSphere technology.
Both projects were of great benefit to every member of the team, from both a technological and a project point of view. We proved ourselves and demonstrated to our customers that we can work with records of practically anything, from books, paintings, sculptures, and clocks to weapons and jewelry.
We were able to create and configure the portal to provide easy search not only for scholars and museum curators, but also for the police, ministry officials, and the general public. And certainly, no one can gain illegal access to information belonging to others. In addition, we integrated the portal with the databases of ministries, Interpol, the National Library, and the municipal museum “somewhere in the mountains.”
If you would like more information on the portals of digital cultural heritage, please contact Jan Schuma (firstname.lastname@example.org).
Concerns have been mounting about the Internet of Things (IoT) recently. Equipment manufacturers have been tussling over standards, prompting some to believe that a ‘Betamax’ situation may arise where some devices cannot connect to the IoT grid.
If such a situation occurs, it could seriously impact the adoption of Big Data projects. Big Data does not depend on the IoT – there are many other types of large databases – but the constant flow of IoT data means that most IoT projects will also require a Big Data element.
However, there is some good news from the analyst community. New data from IDC suggests that spending on Big Data will grow to just under $50bn by 2019 – compound growth of 23.1% each year from 2014.
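As a rough sanity check on those figures (the 2014 base below is derived from the article’s $50bn and 23.1%, not stated by IDC), compound annual growth works like this:

```python
def project(base, rate, years):
    """Project a value forward at a compound annual growth rate."""
    return base * (1 + rate) ** years

# Working backwards from ~$50bn in 2019 at a 23.1% CAGR, the implied
# 2014 market size is roughly 50 / 1.231**5, i.e. about $17.7bn.
base_2014 = 50e9 / (1.231 ** 5)
print(round(base_2014 / 1e9, 1))  # ≈ 17.7 (billions of dollars)
```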
The real elephant in the room for the Big Data market is the security of collected data. There have been several damaging data leaks by major organisations in recent months. The danger for companies that are collecting large amounts of data is that leaks of private data will cause brand damage so serious that they could even face an existential threat.
IDC believes that large companies are aware of this danger and are planning their Big Data infrastructure with security in mind.
“The ability to leverage big data and analytics to develop an integrated view of customer activities and business operations will provide competitive differentiation to companies across industries,” said IDC programme director Jessica Goepfert.
“However, in addition to the huge opportunities, big data presents some significant risks and liabilities to organisations. Line of business and IT executives will need to approach these ongoing challenges with awareness, flexibility, adaptability, and responsibility.”
This is an area of the technology business that is growing by around one quarter every year right now. There will need to be some big mistakes to derail this market, but it is possible. The constant stream of security stories in the media shows that the public are more aware than ever of the dangers ahead. Companies adopting Big Data need to ensure they are always one step ahead of the data thieves.
The industry analyst Gartner Group has issued a list of ten technologies to watch for 2016. These are the trends that the analyst firm believes will be shaping the digital agenda next year.
You can click here to go to the Gartner newsroom where they list all of their predictions, but here I want to comment on what I see as their top three.
Adaptive Security Architecture
If a CEO today is not aware of the importance of security then their board should be asking how they got the job. Major companies are now facing existential threats because technology systems were hacked. Consumer companies with personal data on millions of customers are particularly at risk and one hack can destroy many years of trust in a brand. Making security smarter, tighter, and more able to adapt to changing attack methods will be an enormous trend in 2016.
The Internet of Things (IoT)
Despite recent suggestions that the IoT is stalling because there is still no single agreed standard, I believe there is enough momentum in this trend to start creating a significant amount of work. Naturally, this connects to an increased need for expertise in Big Data analysis, as the IoT creates enormous amounts of data.
Machine Learning
Machine learning is getting smarter. People laughed at Apple’s Siri when it was first launched, but have you tried it recently? Intelligent agents have improved enormously. The Amazon Echo system replicates Siri in the home, allowing a user to ask questions from anywhere in the house. Similarly, machine intelligence is set to revolutionise customer service operations as the most common enquiries are recognised and handled by robots – Robotic Process Automation.
As always, the Gartner predictions are interesting, but after checking the complete list of ten, which would you pick as your top three?
by Irina for IBA Group
Posted on October 20, 2015
One of the key advantages for brands mining Big Data is the information it can reveal about their customers. Trends can be spotted and in many cases actions by customers can be predicted before they take place.
This is particularly applicable to financial services because records of financial transactions are thorough. Financial service companies can use customer behaviour to predict the best time to offer a new product – such as a loan – or even when a customer might be struggling and about to default on financial commitments.
But European regulators are pushing back, concerned that if companies can analyse data and create predictions that can be used to sell additional products then it may be seen as an invasion of privacy by some customers.
The three EU financial regulators – the European Banking Authority, European Securities and Markets Authority, and European Insurance and Occupational Pensions Authority – have joined together to study the effect of Big Data on customers in Europe. There is no formal date announced for their report, but they have indicated that they will be studying Big Data closely for about the next year.
Banks and insurance companies have many legitimate uses for Big Data that go beyond just marketing alone, fraud prevention for example, so it will be important for companies using Big Data to explain the benefits to the regulators over this coming year.
If the regulators conclude that analysing data in this way is invasive, it could create a problem for many banks that are now investing heavily in this technology. It’s up to the industry to demonstrate their need.
We have all seen the numbers related to the Internet of Things (IoT). 50 billion devices will be connected by 2020 with $19 trillion of business opportunity. It’s a big deal.
But there is a hidden side to the numbers, which a new feature in Forbes magazine has explored. Ten different industry groups are trying to define standards and frameworks for the IoT, yet six companies that employ 780,000 people and have net annual sales of $428 billion almost entirely control the industry.
Forbes suggests that it’s like a giant casino. Each of these big technology companies presents a united front to its clients, but behind the scenes everyone is fighting for a dominant position on the bodies that are defining standards.
Wars over standards always break out when new technologies come along. Who can forget the old days of VHS vs. Betamax video? However, this time the prize for being a dominant player is enormous.
The problem here is that the Internet of Things is a concept or strategy. It requires the creation of an entire ecosystem that involves both hardware and software. If every device needs to be connected to every other device then manufacturers across several different industries need to start working together.
The IoT isn’t a foregone conclusion. It will only work with devices that can interact with each other. If large technology companies only see the market opportunity and start battling for turf before even agreeing on how the IoT can and should work then it may never work.
We may end up in a situation where only certain products can link up to others, or worse still, you need to buy everything from a single manufacturer to be able to create a connected environment. It’s time for the big companies in this market to really start working together, rather than just joining trade bodies and pretending there is unity.
As the Forbes feature says: “Who will win with this strategy? It won’t be us.”
28. 08. 2015
Finally, legislation is catching up with technology. Nevertheless, it is still very important to ensure safe storage and the validity of electronic documents. This article by Aleš Hojka, IBA CZ General Director, reviews the options available in the market and what should be considered before purchasing a DMS (document management system).
In recent years, the term “e-document” has come up frequently when talking about document creation and storage, as well as document lifecycle management. It is undoubtedly a fast-emerging trend in the ECM / DMS area.
The fundamental driver here is a shift in legislation, not only Czech but European as well. For example, the Regulation on electronic identification and trust services for electronic transactions in the internal market (eIDAS) clearly states that an electronic document has the same legal force as a paper document and that no authority in any EU Member State can reject a document simply because it is not on paper.
Providers of ECM / DMS systems and process consulting should also keep up with the times, as customers increasingly ask for document archiving, or a so-called trusted archive. We most frequently come across e-documents in banks and insurance companies, where the hallmark of safe storage, archiving, and document inalterability is supplemented by, for example, a digital signature, certificate, or biometric signature.
So how can you not only process and archive documents efficiently over the long term, but also ensure their legal validity, so that in both form and meaning they carry the same permanence and evidential value as their paper counterparts? First, you need to look at the specific requirements of the customer and then offer suitable software, or an integration of multiple systems. A wide range of options comes to mind, from enterprise solutions through cheaper alternatives to open source solutions.
Every customer, new or existing, needs to be approached individually and offered an optimal solution. Unfortunately, we come across more and more customers who were recommended an unreasonably large (enterprise) system and who now face deploying additional agendas whose development and integration are much more complicated and expensive than they would be with simpler DMS alternatives. Not every customer needs a robust solution, but the reverse is also true: certain requirements cannot be completely covered by an open source solution. When implementing a system, it is always necessary to bear in mind the total cost of ownership.
If customer requirements are well defined, we should rely on our experience when recommending the appropriate DMS. If the storage will be used for archiving documents, and there is no requirement for workflow processes or robust internal integration with other systems, we shouldn’t be afraid of using proven technologies such as MS SharePoint Foundation, Alfresco CE, or ELO, as well as other open source solutions that can cover these requirements completely.
Another category is the implementation of a system where the customer expects an emphasis on speed, a powerful processing engine, the ability to integrate with other systems, performance, and scalability. In this case, we choose enterprise systems such as IBM FileNet, EMC Documentum, Microsoft SharePoint, or OpenText.
by Irina for IBA Group
Posted on September 23, 2015
In November 2014 I was lucky enough to be invited to Minsk to visit the IBA development centre. I visited with the Ovum analyst Peter Ryan, who was over from Canada and possibly not feeling quite as cold as I was, after arriving from the Brazilian summer.
One of the projects I remember most from that visit was the ticketing system for the Minsk public transport system. IBA designed a complete solution for the bus and metro network that would comprise the cards, the card readers, the recharging systems, and all the software needed to make the system function. It was far more than just a software project and really showed how companies need to approach business solutions rather than technical challenges.
But the most interesting thing about the public transport system was not how it was delivered; rather, it was what could be learned after implementation. Suddenly the Minsk City Authorities had access to information on every bus or metro ride taking place across the entire city. When, where, and how journeys were being made was suddenly being recorded and could be analysed.
The reality was that the data created by the software and hardware system was probably worth more than the system itself.
I was reminded of this when I saw in Forbes that commuters in Stockholm, Sweden, will soon be able to access similar data on the travel patterns in their own city. With the data on Stockholm travel passing through a Big Data analysis engine, it should be possible for commuters to see what will be happening on the public transport system two hours in the future.
This ability to predict the future will allow customers to change behaviour and avoid hot spots. Naturally this will change the predictions, but the system will be able to revise predictions in real-time.
Some commuters have complained about the move away from paper tickets and cash payments, but when anonymised commuter data can be collected and analysed in this way, new benefits become clear. I know that I would love to be able to see how the transport system will look where I live, even just one hour in the future.
Companies with Big Data expertise and city governments have the power to make the life of commuters so much easier – let’s hope more cities copy the example of Stockholm and Minsk.
New data published this month in Logistics Manager magazine indicates that nearshoring is a strategy gaining in popularity – particularly in Europe. In the survey, 56% of respondents indicated that they favour “rightshoring” over offshoring to the lowest possible cost location.
The Logistics Manager data is focused mainly on manufacturing businesses ensuring that their manufacturing facilities are as close as possible to the end customers while also balancing production costs, but the strategy can be applied equally to other industries. The reason for this is that what is actually changing is the way we manage supply chains.
Any hand-off of a process between different departments, or from one company to another, takes time and involves risk. This is true of both services and manufacturing. When a process is passed between teams internally, there is a risk of failure during the transition. This risk is multiplied when offshore outsourcing means that a process has to be handed to another organisation in another location – often far away in a different time zone.
The traditional metrics that decide how to organise an outsourcing strategy focus on three areas: What is the cost? Can quality be maintained or improved? Can delivery time be reduced, so that factors such as time-to-market improve?
Nearshoring has always had an advantage over more general offshoring in all of these metrics, except for the cost. But with a renewed emphasis on the supply chain, it may well be that the correct focus is the quality of the team, or service, anyway.
Low-cost services are no good to any company if they don’t work. Imagine a luxury goods retailer using the cheapest possible contact centre company for its customer service. How would that reflect on the brand? Likewise, an innovative drug company would not want the cheapest, least innovative technology service. What companies really need when they outsource today is a partner.
Partnership is what sales teams used to talk about before the sale, but it has become a reality: companies providing different services into a supply chain have genuinely become part of the team. Outsourcing has become a standard strategy, and companies have got so good at it that suppliers slot in and work as if they were part of the client.
This means that nearshoring – as opposed to offshoring with a focus on cost only – takes on renewed importance. If you want a better price for services, but not necessarily the lowest, because quality and time factors matter most, then European companies working with partners in Europe become the ideal business solution.
What has really happened as outsourcing has matured is that the boundary of the organisation has become flexible. Instead of thinking of the client with a supplier, it’s better to now just think of the client having flexible organisational boundaries that include some of their own team and some suppliers – but whoever actually does the work, they all exist within the organisational boundary of the client.
This is a big change and it certainly makes nearshoring an attractive option. Have you considered the differences between nearshoring and offshoring in the context of the way that company supply chains have changed in the past few years? Please do leave a comment with your own ideas here.