We have all seen the numbers related to the Internet of Things (IoT). 50 billion devices will be connected by 2020 with $19 trillion of business opportunity. It’s a big deal.
But there is a hidden side to these numbers, one that a new feature in Forbes magazine has explored. Ten different industry groups are trying to define standards and frameworks for the IoT. Six companies, which together employ 780,000 people and have net annual sales of $428 billion, control almost the entire industry.
Forbes suggests that it’s like a giant casino. Each of these big technology companies presents a united front to its clients, but behind the scenes everyone is fighting to secure a dominant position on the bodies that are defining standards.
Wars over standards always break out when new technologies come along. Who can forget the old days of VHS vs. Betamax video? However, this time the prize for being a dominant player is enormous.
The problem here is that the Internet of Things is a concept or strategy. It requires the creation of an entire ecosystem that involves both hardware and software. If every device needs to be connected to every other device then manufacturers across several different industries need to start working together.
The IoT isn’t a foregone conclusion. It will only work with devices that can interact with each other. If large technology companies only see the market opportunity and start battling for turf before even agreeing on how the IoT can and should work, then it may never happen.
We may end up in a situation where only certain products can link up to others, or worse still, you need to buy everything from a single manufacturer to be able to create a connected environment. It’s time for the big companies in this market to really start working together, rather than just joining trade bodies and pretending there is unity.
As the Forbes feature says: “Who will win with this strategy? It won’t be us.”
28. 08. 2015
Finally, legislation is catching up with technology. Nevertheless, it is still very important to ensure secure storage and to guarantee the legal validity of electronic documents. This article by Aleš Hojka, IBA CZ General Director, reviews what the market offers and what should be considered before purchasing a DMS (document management system).
In recent years, the term “e-document” has come up more and more often when talking about document creation and storage, as well as document lifecycle management. In the ECM / DMS area, it is undoubtedly a fast-emerging trend.
The fundamental driver of this shift is legislation, not only Czech but European as well. For example, the Regulation on electronic identification and trust services for electronic transactions in the internal market (eIDAS) clearly states that an electronic document has the same legal force as a paper document and that no authority in any EU Member State can reject a document just because it is not on paper.
Providers of ECM / DMS systems and process consulting should also keep up with the times, as customers increasingly ask about document archiving, or the so-called trusted archive. Most frequently, we come across e-documents in banks and insurance companies, where the “hallmark” of safe storage, archiving and document inalterability is supplemented by, for example, a digital signature, certificate or biometric signature.
So how can we not only process and archive documents efficiently over the long term, but also ensure their legal validity, so that formally and substantively they carry the same permanent and evidential value as their paper counterparts? First you need to look at the specific requirements of the customer and then offer suitable software, or an integration of multiple systems. A wide range of options comes to mind, from so-called enterprise solutions through cheaper alternatives all the way to open source.
Every customer, whether new or existing, needs an individual approach and an optimal solution. Unfortunately, we come across more and more customers who were recommended an unreasonably large (enterprise) system and who are now facing the deployment of additional agendas whose development and integration are far more complicated and expensive than they would be with a simpler DMS alternative. Not every customer needs a robust solution, but the reverse is also true: certain requirements cannot be fully covered by an open source solution. When implementing a system, it is always necessary to bear in mind the total cost of ownership.
If customer requirements are well defined, we should rely on our experience when recommending the appropriate DMS system. The storage may be used for archiving documents, and if there is no requirement for workflow processes or robust internal integration with other systems, we shouldn’t be afraid of using proven technologies such as MS SharePoint Foundation, Alfresco CE or ELO, as well as other open source solutions that can cover these requirements completely.
Another category is the implementation of a system where the customer expects an emphasis on speed, its own process engine, the ability to integrate with other systems, performance, scalability, and so on. In this case, we choose enterprise systems like IBM FileNet, EMC Documentum, Microsoft SharePoint or OpenText.
by Irina for IBA Group
Posted on September 23, 2015
In November 2014 I was lucky enough to be invited to Minsk to visit the IBA development centre. I visited with the Ovum analyst Peter Ryan, who was over from Canada and possibly not feeling quite as cold as I was, after arriving from the Brazilian summer.
One of the projects I remember most from that visit was the ticketing system for the Minsk public transport system. IBA designed a complete solution for the bus and metro network that would comprise the cards, the card readers, the recharging systems, and all the software needed to make the system function. It was far more than just a software project and really showed how companies need to approach business solutions rather than technical challenges.
But the most interesting thing about the public transport system was not how it was delivered; rather it was what could be learned after implementation. Suddenly the Minsk city authorities had access to information on every bus or metro ride taking place across the entire city. When, where, and how journeys were being made was suddenly all being recorded and could be analysed.
The reality was that the data created by the software and hardware system was probably worth more than the system itself.
I was reminded of this when I saw in Forbes that commuters in Stockholm, Sweden, will soon be able to access similar data on the travel patterns in their own city. With the data on Stockholm travel passing through a Big Data analysis engine, it should be possible for commuters to see what will be happening on the public transport system two hours in the future.
This ability to predict the future will allow customers to change behaviour and avoid hot spots. Naturally this will change the predictions, but the system will be able to revise predictions in real-time.
Some commuters have complained about the move away from paper tickets and cash payments, but when anonymised commuter data can be collected and analysed in this way, new benefits become clear. I know that I would love to be able to see how the transport system will look where I live, even just one hour in the future.
Companies with Big Data expertise and city governments have the power to make the life of commuters so much easier – let’s hope more cities copy the example of Stockholm and Minsk.
New data published this month in Logistics Manager magazine has indicated that nearshoring is a strategy that is gaining in popularity – particularly in Europe. In their data, 56% of respondents indicate that they favour “rightshoring” over offshoring to the lowest possible cost location.
The Logistics Manager data is focused mainly on manufacturing businesses ensuring that their manufacturing facilities are as close as possible to the end customers while also balancing production costs, but the strategy can be applied equally to other industries. The reason for this is that what is actually changing is the way we manage supply chains.
Any hand-off of a process between different departments, or from one company to another, takes time and involves risk. This is true of services and manufacturing alike. When a process is passed between teams internally there is a risk of failure during the transition. This risk is multiplied when offshore outsourcing means that a process has to be handed to another organisation in another location – often far away in a different time zone.
The traditional metrics that decide how to organise an outsourcing strategy focus on three areas: what is the cost? Can the quality be maintained or improved? Can the time required to deliver be reduced so that factors such as time-to-market can be improved?
Nearshoring has always had an advantage over more general offshoring in all of these metrics, except for the cost. But with a renewed emphasis on the supply chain, it may well be that the correct focus is the quality of the team, or service, anyway.
Low cost services are no good to any company if they don’t work. Imagine a luxury goods retailer using the cheapest possible contact centre company for customer service. How would that reflect on the brand? Likewise, an innovative drug company would not want the cheapest, least innovative technology service. What companies really need when they outsource today is a partner.
Partnership is what sales teams used to talk about before a sale, but it has become a reality: companies providing different services into a supply chain have genuinely become part of the team. Outsourcing has become a standard strategy, and companies have got so good at it that they slot in and work as if they were part of the client.
This means that nearshoring – as opposed to offshoring focused on cost alone – takes on a renewed importance. If you want a better price for services, though not necessarily the lowest, because the quality and time factors matter most, then European companies working with partners in Europe become the ideal business solution.
What has really happened as outsourcing has matured is that the boundary of the organisation has become flexible. Instead of thinking of the client with a supplier, it’s better to now just think of the client having flexible organisational boundaries that include some of their own team and some suppliers – but whoever actually does the work, they all exist within the organisational boundary of the client.
This is a big change and it certainly makes nearshoring an attractive option. Have you considered the differences between nearshoring and offshoring in the context of the way that company supply chains have changed in the past few years? Please do leave a comment with your own ideas here.
I have often written here about the Internet of Things (IoT) and the promise it holds for various industries. Moving on from the cliché of the fridge that knows when you are running out of food, the IoT holds enormous promise for offering the ability for objects and systems to make pre-emptive decisions.
The car is a great example. We have long had warning lights on cars to indicate particular problems such as low oil or a high engine temperature, but by gathering data and having the ability to analyse it as a whole, cars are getting far more intelligent and able to self-diagnose before problems become serious. This becomes even more critical when public or commercial vehicles can perform this kind of self-diagnosis. For example a Boeing 777 can generate twenty terabytes of data per engine per hour – a fantastic resource for monitoring that all systems are functioning normally.
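To put that engine figure in perspective, here is a back-of-the-envelope calculation. The 20 TB per engine per hour comes from the text; the twin-engine configuration and the ten-hour flight are my own illustrative assumptions.

```python
# Rough data-volume estimate, assuming the 20 TB per engine per hour
# figure above, a twin-engine aircraft, and a hypothetical
# ten-hour long-haul flight.
TB_PER_ENGINE_PER_HOUR = 20
ENGINES = 2
FLIGHT_HOURS = 10

total_tb = TB_PER_ENGINE_PER_HOUR * ENGINES * FLIGHT_HOURS
print(f"Data generated on one flight: {total_tb} TB")  # 400 TB
```

On those assumptions a single long-haul flight produces hundreds of terabytes, which is why on-board pre-filtering matters as much as the analysis itself.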
Personal fitness is another excellent example, and as a runner I really like being able to look back at previous runs. I can see not only the distance covered, but also the weather, the location, and the hills I tackled. I can use this information without additional processing just to plan new runs, or I can crunch the data to get great insight into my own performance.
But is there a downside to all this data collection and could it cause individuals to reject certain products or services?
Facebook Local Awareness advertising allows businesses – especially retailers – to advertise to Facebook users that are geographically close. The business creates an advert and targets it to a particular demographic – a specific age group or people who like certain products – then Facebook takes care of ensuring that when suitable people are geographically close to the business they are served an ad on their phone.
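At the heart of any proximity-based targeting like this is a simple distance check. The sketch below is not Facebook’s actual implementation, just a minimal illustration using the standard haversine formula; the shop and user coordinates are hypothetical.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in kilometres."""
    r = 6371.0  # mean Earth radius, km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def is_nearby(user, shop, radius_km=0.5):
    """Serve the ad only when the user is within radius_km of the shop."""
    return haversine_km(user[0], user[1], shop[0], shop[1]) <= radius_km

shop = (53.9006, 27.5590)  # hypothetical boutique location
user = (53.9010, 27.5600)  # user walking past, a few dozen metres away
print(is_nearby(user, shop))  # True
```

Combine a check like this with a stored interest profile and you have, in miniature, the mechanism the next paragraph reacts to.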
To some people this sounds great. If I am the kind of person who likes shopping for fashionable clothes and, as I walk past a boutique, they send me a 25% off voucher valid only for the next hour, then I might stop by and buy something. To other people this is going to feel like an enormous invasion of privacy. Not only is a social network – in this case Facebook – monitoring what I like and dislike to build up a profile of me as a consumer, but now it is monitoring where I am too.
A recent feature by Irish technology expert Maria Farrell in The Guardian argued that by 2020 over 100 billion individual devices would be connected to the Internet. With around 7 billion people on the planet that’s around 14 online devices for every person. If anything, I believe that is a conservative estimate given the rate of change.
The implications for this are clear. Even if you don’t agree with the way that an organisation is using your data – perhaps like the retailer example – most people believe that there is nothing they can do. We have accepted so many services as ‘free’ knowing that we pay for them with our data and now that we have come this far there is almost nothing that can be done to reverse the situation.
In The Guardian, Farrell argues:
“The unholy alliance of CCTV, face recognition, mobile phones, fitness trackers and other wearable technologies, data brokerage and analytics, private ownership and control of previously public spaces like city squares, and increasingly wide-ranging policing powers mean we live in an urban world of ambient surveillance we never voted for. We are no longer citizens enjoying civic space; we are crops to be harvested, we are potential risks to be controlled. The internet of things does all that for us and more.”
The implication is that data will not always be used in the way we assume it might be. Health trackers that monitor runs might be informing health insurers about how you are looking after yourself. Cars might not just be self-diagnosing problems, but also telling your insurer if you drive aggressively. Credit scoring agencies might be building up a picture of your likes, dislikes, and habits in addition to spending patterns. Employers may be monitoring your every arrival, departure, and keystroke at the office.
Only the powerful can argue against this. The people who need a job, need to drive regularly, or need a health insurer cannot refuse the terms and conditions that are demanded. If a health insurer demands that you hand over information on your health habits in return for insurance cover, then what can the average person possibly do to protest?
The Internet of Things has many clear benefits to society, but it is this question of data use and privacy that will cause many doubts to surface. Some have rejected social networks because they want to avoid sharing too much information about their life, but when information sharing becomes a condition of employment or insurance, it will be impossible to avoid.
Can we handle the implications of a world where everything is known about you as a person or is there still time to preserve some privacy?
In July this year, I ran 153km. My doctor would probably be pleased to see this, as it’s quite a good average of about one marathon every week. I was hoping to do even better in August, but I’ve not been very well so my figures are lower.
I know the distance covered because my iPhone has the Nike+ app. Not only is it recording the distance each time I run, but I can see where I went, the hills covered, the split times, average pace, the weather, and the type of terrain covered. Add an extra accessory, like the Apple Watch, and I could be tracking and recording my heart rate too.
Tools like this are the reality of concepts that we all see in technology journals – Big Data and the Internet of Things are two concepts that we read about all the time and yet they are often undefined or unclear.
Take another example related to health. The games company Nintendo is in the process of launching an entire suite of products related to “Quality of Life”. The first one is a box you can place by your bed as you sleep. It does not need to touch you, it just needs to be close, maybe by your bedside light and book.
The device monitors you as you sleep. It gathers data over time and compares it to other people and can recommend how you can improve your health. The device can give concrete advice (usually related to exercise or food) based on knowledge of the way people sleep and it does not even need to be worn.
All these devices are capturing enormous amounts of data that we never used to capture. In theory, open sharing of health-related data with health professionals should make their life easier and improve diagnosis for patients.
But it’s not always as easy as the technology suggests. A new paper in the IEEE Journal of Biomedical and Health Informatics explores Big Data use in healthcare and why it is taking longer than expected to achieve the promised benefits.
The real challenges are:
1. There is just so much data being stored. Most individual healthcare providers don’t know what to do with all the data they have – and that is just at the level of individual organisations.
2. Finding a way to use the data is difficult. Most healthcare managers are experts neither in the IoT concepts being used to create the data nor in the Big Data techniques that allow enormous databases to be studied.
The “Holy Grail” for healthcare providers is to be able to create an “Electronic Health Record”… a single key that allows every possible piece of information on a single patient to be indexed. This would include traditional patient notes, but also any X-rays or MRI scans performed over a lifetime. It would also include extra information, such as sleep patterns from a Nintendo device, exercise records from a Nike device, and a record of your pulse from the moment you are born to the moment you die.
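The “single key” idea is easy to sketch as a data structure: one patient identifier indexing records from many sources. The class and field names below are purely illustrative, not any real EHR standard such as a hospital system would use.

```python
from dataclasses import dataclass, field
from typing import Dict, List, Optional

@dataclass
class HealthRecord:
    source: str    # e.g. "hospital", "sleep_monitor", "fitness_app"
    kind: str      # e.g. "x_ray", "sleep_pattern", "run"
    payload: dict  # the actual document or measurement

@dataclass
class PatientIndex:
    # One key (the patient ID) maps to every record ever collected.
    records: Dict[str, List[HealthRecord]] = field(default_factory=dict)

    def add(self, patient_id: str, record: HealthRecord) -> None:
        self.records.setdefault(patient_id, []).append(record)

    def history(self, patient_id: str,
                kind: Optional[str] = None) -> List[HealthRecord]:
        items = self.records.get(patient_id, [])
        return [r for r in items if kind is None or r.kind == kind]

index = PatientIndex()
index.add("patient-001", HealthRecord("hospital", "x_ray", {"region": "chest"}))
index.add("patient-001", HealthRecord("fitness_app", "run", {"km": 10.5}))
print(len(index.history("patient-001")))         # 2 records under one key
print(len(index.history("patient-001", "run")))  # 1
```

The hard part in practice is not this index but agreeing on identifiers and formats across hospitals, insurers, and device makers, which is exactly where the delays discussed above come from.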
Technologically we are there already. Mainstream equipment such as smart phones and smart watches are already making the data collection possible, but can the healthcare companies actually make sense of all the information they can access?
At present no, but it goes to show that healthcare is about to be one of the growth industries of the century. Populations are getting older and information technology is blending with normal life in a way that nobody could have imagined a decade ago.
Big Data and the IoT have a real and definable purpose in healthcare. Where do you think the next big healthcare innovation will take place?
I have talked a lot about Big Data on this blog. It is a technology that is now becoming normal and accepted in the enterprise, largely because of two factors:
1. The Internet of Things (IoT) means that every electronic device is becoming connected. Even light bulbs can now be assigned an IP address so you can connect them to a home control system. All these connected items generate vast amounts of data…
2. Consumer behaviour and consumers’ relationship to brands have been entirely reversed in the past five years: where brands once offered a single way to get in touch, consumers now define exactly how they want to review or criticise products. Brands need to seek out comment and to engage wherever the customers are located.
There are many more factors, but I believe that these two broad changes are responsible for creating enormous amounts of data – amounts that seemed unfeasible a decade ago.
The industry analysts support this view. Ovum recently published research predicting that the Big Data market will grow 50% each year from now until 2019. Compounded annually, this means that by 2019 the market for Big Data software and expertise will be six times bigger than it is now.
Six times. That’s a lot of market growth. The Ovum Big Data Practice Leader, and co-author of the report, Tom Pringle, said: “The experimental era of big data is coming to an end, organizations are formalizing their use of big data technology to realize the business value they expect to find.”
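The compounding behind that headline figure is easy to sanity-check. Assuming a 50% annual growth rate from a baseline of 1.0 today, four years of compounding gives roughly a five-fold increase and a fifth year roughly seven-and-a-half-fold, so “six times” sits in that range depending on where you count from.

```python
# Sanity check of 50% annual compound growth, normalised to a
# baseline of 1.0 in 2015.
GROWTH = 1.5
market = 1.0
for year in range(2016, 2020):  # four compounding years, 2016-2019
    market *= GROWTH

print(round(market, 2))  # 5.06 after four years; a fifth year gives ~7.59
```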
The important factor to note here is that Ovum is suggesting that the time for experimenting with Big Data is over. Many companies have tried it, toyed with open source software and systems, and experimented with the insights they can gain from Big Data analysis, but it is now proven that many companies need these insights.
The time has come to call in the experts.
Think about the consumer technology that you regularly use today. You probably have a smart phone, maybe a Kindle or other e-reader, maybe an Apple Watch or similar device that can access information from your phone. Maybe your car can hook up to your phone to offer in-car information. Maybe you have an Amazon Echo at home so you can access the Internet just by speaking?
All these consumer devices are available today and are accepted as normal. Most consumers expect to have a device that gives them 24/7 access to all the services and information that the Internet can offer.
So why isn’t enterprise technology like this? Many companies still issue phones that are not even smart and laptops that are too heavy to really be portable. The concepts of cloud computing and app store flexibility remain conceptual in many organisations. Why?
The obvious answer is that consumers have far less to invest than large companies. When purchasing technology, a CIO needs to set the agenda for several years. If things change during that time it can be difficult to shift direction or to keep up with the change. Individuals don’t face this problem.
This has led to the popularity of Bring-Your-Own-Device (BYOD) policies in many companies, where employees are offered cash to use their own equipment instead of what the company can supply.
But a small change in the strategic mindset can also have a major benefit to the enterprise. Commissioning new software solutions as apps, rather than desktop tools can encourage the workforce to be mobile. This can even encourage companies to create entirely new solutions for customers.
An app developed by IBA for use by a bank in South Africa allows bank employees to sign up new customers on the move. They can photograph the customer using their phone and capture details which are then shared with the central system of the bank – no forms, no waiting for an appointment. The new customers, the mobile bank employees, and the bank executives all benefit from the app approach.
It used to be that enterprise technology was years ahead of what people had at home, but now the reverse is true. It’s time for more company executives to take inspiration from the tools they use every day – how can we use mobile devices and other common personal technology to create better business solutions for our customers?
In my last blog I mentioned that Big Data has progressed far beyond just being a business buzzword. There are entire industries being shaken to their core because a leading player finds a way to analyse their customers better than their rivals. Far from being a management trend, this is a strategy that will fundamentally change many industries.
But have you ever stopped to appreciate just how much data we are creating today?
Some excellent analysis in Business Insider recently explored this question. The problem is that people and companies are just creating so much data – it is increasing at an exponential rate. At the present rate we are doubling the amount of digital data that exists every two years.
But even though we are creating and storing these enormous amounts of data, only 0.5% of it is being analysed. There is so much data out there that companies, governments, and individuals feel swamped, unable to gain insights from it.
The Business Insider article features a comment that cuts to the heart of the Big Data issue: “You have to start with a question and not with the data,” says Andreas Weigend, former Chief Scientist of Amazon, now director of the Social Data Lab and lecturer at UC Berkeley.
Businesses need to start thinking about the insights they could get from their customers and to ask more ‘what-if’ questions. The answers are out there in the data, but it is impossible to analyse every byte.
A typical analogy for the average person might be the difference between email and Twitter. You check every email, even if it is only long enough to decide that it should be deleted. However, you only check Twitter messages that are arriving as you are looking at the news stream, or you use intelligent filters and analysis to ensure that interesting messages are made visible.
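The Twitter-style approach in that analogy can be sketched as a trivial stream filter: instead of reading every message (the email model), surface only the items that match declared interests. The keywords and messages below are hypothetical, and a real system would use far more sophisticated relevance scoring.

```python
# A minimal "intelligent filter": surface only stream items that
# match a set of declared interests.
INTERESTS = {"big data", "iot", "outsourcing"}

def is_interesting(message: str) -> bool:
    text = message.lower()
    return any(keyword in text for keyword in INTERESTS)

stream = [
    "New Big Data report published by Ovum",
    "What I had for lunch today",
    "IoT standards war heats up",
]
visible = [m for m in stream if is_interesting(m)]
print(visible)  # the lunch message is filtered out
```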
Businesses need to start thinking of their Big Data strategy in the same way. How can insights be drawn out from the data they already have?
Big Data is still just a buzzword for many people. Magazines and newspapers that do not cater strictly to a business audience continually need to explain what they mean when talking about the subject and the strong association with technology means that even some business leaders are still unaware of the true benefits.
But as with all technology projects and ideas, if they can be associated with actions that can improve a business, make it more efficient, deliver services faster, or create new products before competitors, then the leaders can see the advantage.
Forbes magazine recently documented a few examples that demonstrate some of the advantages. Carnival Cruises needs to plan the best way to serve passengers in much the same way as an airline does. However, a cruise is a much longer journey than a flight, and across all their ships and passengers Carnival has 80 million cruise days per year. If they could earn just an extra $1 per cruise day, that’s an immediate $80m boost to revenue.
That’s an example of how to analyse customer behaviour so products and services can be targeted more effectively. Retailers still do this with loyalty cards, although the idea of a loyalty card has been falling from fashion in recent years – customers are tired of giving away their personal data in return for very small benefits.
But data can also help to save money and improve service. The Australian telco Telstra applies predictive Big Data analysis across its entire network, so potential faults on lines, and in specific areas, can be identified before they happen. Outage time is reduced, engineers can be moved into position faster, and not only does the company save on maintenance, but the customer is happier too.
Every big business uses data today. Every business has the opportunity to analyse this data in a more effective way. There is always information available if you know how to dig deep into the data you have.