The Year That Was - the Martin version

By Martin Banks, December 27, 2018
Summary:
Martin's hot picks from 2018 on diginomica.

Martin takes a look back at 2018 highlights from diginomica.

(1) Biz/Ops and incremental abstractions – moving up the food chain towards automated business operations

The need is to have software work perfectly – increasingly, it just has to work. But this is set against a backdrop of operational and infrastructural complexity that is growing exponentially. The combination of cloud-based services, microservices and containers, taken together, contributes to this in the majority of enterprise use cases.

Our aim is to help you manage and control the chaos out there so you can use your data to make things happen. Most current tools for such tasks are highly structured. They are systems that won’t adapt and they won’t let you move forward. These days data is very messy, so Splunk is aiming to let you work the way your data works, in a messy data way where you don’t have to try and structure it first.
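
To make that `work the way your data works' idea concrete, here is a minimal Python sketch of schema-on-read – structure pulled out of raw lines only at query time. The log format and field names are invented for illustration, not Splunk's actual pipeline:

```python
import re

# Raw, messy machine data -- no schema imposed at ingest time.
raw_events = [
    "2018-11-02T09:14:07 level=ERROR svc=checkout msg='payment timeout'",
    "garbage line with no obvious structure",
    "2018-11-02T09:14:09 level=INFO svc=search msg='query ok' latency_ms=42",
]

def extract_fields(line):
    """Derive key=value fields from a line at query time (schema-on-read)."""
    pairs = re.findall(r"(\w+)=('[^']*'|\S+)", line)
    return {key: value.strip("'") for key, value in pairs}

# Structure is only imposed when a question is asked of the data.
for event in raw_events:
    fields = extract_fields(event)
    if fields.get("level") == "ERROR":
        print(fields["svc"], "->", fields["msg"])
```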

Why? These two are good exemplars of how subtle, controlled use of AI – where the `A' really does stand for `Augmented' – can take the increasing complexities of a subject and start to make it fly. Here is where the `T' of IT has started to be refined and enhanced to the point where it can really open up the exploitation and value of the `I' of IT. One of the underpinnings of the real value of analytics is that the management of increasingly complex business processes carries on unhindered. This is particularly true when an increasing amount of that management problem is being created by the potential for chaos that can come from the sheer volume of unstructured data now being generated, and its rate of growth.

This brings with it a problem for many businesses. The rate of growth in data, and unstructured data in particular, is now so fast that there is no matching growth in the availability of people with the right management skills to go round. Tools like this will become the only way that companies can hope to make some sense of their data.

Dynatrace Perform 2018 – from APM to the bleeding edge of Biz/Ops Management

Splunk jumps into Biz/Ops to make some sense of the chaos

(2) Using Online as retail theatre

One of the problems Johnson has had to face in this is trying to transfer the customers' in-store experience of what he calls `retail theatre' over onto the website. An example can be seen in the store's famous food halls, where the bakery department has been completely revamped and all those smells of bread and cakes baking are exploited as part of the experience. In the absence of any internet-enabled `Smelly-Vision' technology, getting some of this experience across on the website is the type of challenge he faces.

Why? It is not often that I get to write about a user’s experiences in the world of retail here in diginomica, and when I do it often comes down to tech-related issues such as `why do you use that for this task?’ and `how do you make it do the job?’ But I feel this one really gets to the heart of an important, non-tech issue in retail – the more important task of identifying, exploiting and expressing what the retailer’s brand values are really all about, and how the tech can capture and display the `theatre’ of those values.

As an example, Harrods is just great, for the point is made: here we have a shop (OK, it takes that notion a bit further than your local corner store) that is one of the top three visitor attractions in London. As such, its theatrical value is as the pinnacle of high-end retailing, so the customer experience it offers – even at the other end of the world, via hundreds of internet hops – has to be the ultimate.

The company understood it had to re-position its initial website approach, away from the gift basket and souvenir business and towards a way of creating as much consistency as possible between the website and the store. As a business, it needs to keep faith with those visitors once they are back home and wanting to carry on shopping, as well as give those who never make it to London’s Knightsbridge area a shopping experience that closely matches that of attending the retail theatre of dreams.

Online as retail theater at Harrods

(3) Opportunity knocks for MSPs, if they care to listen

Some of the early cohort of MSPs may have got it, but many have long-term contracts with customers, so their interest is primarily in making more money from those same customers… But we are not really seeing the MSPs develop new stacks in order to provide new services to new customers. That is the opportunity for them now: services such as Desktop as a Service and Disaster Recovery as a Service are obvious candidates.

Why? This is an issue where rephrasing the question as `why the hell not?’ is at least as valid. This has been the year when most vendors and customers – and indeed analysts – have discovered the notion of the hybrid cloud as something new. In fact, of course, it is as old as email, where the vast majority of businesses have used a third party service provider for years. In the end, it is all just `cloud’.

Where the difference comes in is just that issue: what type of service does a user require, and how do they access and consume it? That access is not going to come just from the major cloud service providers such as Amazon AWS or Microsoft Azure; they will not have the resources to manage tailored service provision down to the level of any but the very largest of global customers.

But the vast majority of businesses will have a use for service providers that can pull together and loosely couple the specific applications and management tools that those business users require. MSPs have been doing that kind of business for years, if only on a very parochial, `peasant farmer’ level. But there is now an opportunity for them to grow and follow the same model as the Value Added Resellers that emerged in the wake of the Personal Computer and the small server systems that it spawned. They thrived because so many of their customers a) needed computers, and b) had no idea, and no real need to know, how to work them.

The same can now be said about the relationship of many similar companies to the cloud. They need some level of service to take the aggravation of getting active in the cloud off their backs. It is worth good money to them, because they can make more than they pay out, once they have started exploiting the possibilities. And if the MSPs don’t have the gumption to move into the space ready-made for them, rest assured many others will.

Opportunity knocks for MSPs as cloud becomes ‘on-demand ubiquity’?

(4) Re-intermediation and the Algorithm Economy

I’ll make a further point, which I think puts a bow on it. We’re at a point now where we’re helping customers figure out which algorithm to use, and we let the software tell them that. So they don’t necessarily have to know.

Why? AI is certainly this year's favourite flavour, but the underpinnings, the keys to making it all possible, are the algorithms. These are getting more varied, more complex, and more subtle in the capabilities they can offer users. This is also changing the way that users perceive their data and how they set about exploiting it. The notion of data having a `gravity' is taking hold – the idea that a Petabyte is a weight as well as a number. There is a realisation that there is much to be gained by using algorithms that can work where the data resides, rather than wasting the effort and resources of shifting it to where the compute function lives.
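
As a toy illustration of that `take the algorithm to the data' point – a sketch only, with a made-up table and SQLite standing in for a remote data store – pushing the aggregation down to where the data sits means only a small result crosses the wire, instead of every raw record:

```python
import sqlite3

# SQLite stands in here for a remote store holding a large table.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE readings (sensor TEXT, value REAL)")
con.executemany(
    "INSERT INTO readings VALUES (?, ?)",
    [("s1", 1.5), ("s1", 2.5), ("s2", 4.0)],
)

# Naive approach: ship every raw row to the client, then compute.
totals = {}
for sensor, value in con.execute("SELECT sensor, value FROM readings"):
    totals[sensor] = totals.get(sensor, 0.0) + value  # all the data moved

# Data-gravity approach: push the computation to where the data lives,
# so only the aggregated result crosses the wire.
pushed = con.execute(
    "SELECT sensor, SUM(value) FROM readings GROUP BY sensor"
).fetchall()

print(totals)  # {'s1': 4.0, 's2': 4.0}
print(pushed)  # [('s1', 4.0), ('s2', 4.0)]
```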

It also means that there is a new economy growing up around those capabilities – not only in terms of the new businesses developing the algorithms, but also for the users. They are starting to find whole new services that they can offer their customers, opening up new markets filled with businesses and consumer end users that can exploit and/or enjoy the results.

Tibco talks re-intermediation and the Algorithm Economy

(5) Data as the new 'oil'

In essence, the real sexy stuff, and the stuff that makes the most money by a country mile, will not in future be found in producing new technology or new applications. It will come from using those tools to squeeze every last drop of insight, possible trend or merest whisper of a prediction out of the vastest of vast accumulations of data it is now possible for some businesses to hold and manipulate. Till now, business applications have never really changed – they have been systems of oppression. But we are now on the verge of systems that are fundamentally different, that can give you guidance and advice.

Why? Oil has driven the growth of the global economy for at least 150 years, and is liable to continue to play a role for a good few years yet. But there is now a new kid on the global-economy-driving block – data. It is not that new in practice, but the tools now available to us to manipulate it have turned it from a simple tool of steam-hammer brutality – the bigger the numbers, the more statistically significant the result – into a highly complex set of sophisticated tools of ever greater subtlety and precision. And it is that change which makes data the single most important engine for the future of economic development.

What is even more important is that, till now, it has only been the globally significant businesses that have been willing and able to pay the eye-wateringly large amounts of good money to get their hands on such results. Now, tools are starting to appear that either cut the cost of buying and using them, or change the economic model completely by being available to use on an as-needed basis from the cloud, liberating data's underlying meaning as an exploitable resource while also protecting its integrity and security.

This also brings a fundamental change in the possibilities available to users. Not only has data tended to be big and crude, it has also tended to focus on the known knowns. But now users can start to address the classic Donald Rumsfeld paradox: `we don’t know what we don’t know’. With customer interactions now coming in anonymously, businesses can start to find and connect all the dots about them. Those `dots’ can now be about markets, products and their lifecycle, or services, or any combination of the three.

If data is the new oil, what price trust?

Microsoft pushes the advantages of making data common

(6) AI = money (for sure) but also the law of unintended consequences

Hardly 'hot news’ of course, and almost into the realm of being blindingly obvious, but when I get yet another press release from a vendor in the AI space asking me to tell you, dear reader, to get on with buying AI products (preferably theirs, of course), all I can envisage is the emergence of AI horror stories late this year when the law of unintended consequences starts to have a field day on some early implementations. Early adopters often get bitten.

But at present no one can be sure that the AI has all the right information for the job, because no one can be really sure what the ML systems need to be trained to look for. And while we all think that is obvious, it is – but only up to a point. Our near-term future, at the very least, is made up of such developments, but we are still in the early-learning phase of how to build the `bricks' properly, let alone how to put them together to make `walls'.

The new model here is that the question now goes to the source of the answer, like Greeks visiting the Oracle of Delphi. With this in mind, and given the continued exponential growth of data volumes, it is possible to see that estimates of the cloud services marketplace – currently put at $50 billion – growing to $250 billion in fairly short order may not be overly optimistic.

Why? Well, let's face it, Artificial/Augmented Intelligence and Machine Learning are THE hot topics of the year, and will no doubt continue to be so for a good few years yet. I am certain I will not be the only diginomica correspondent to touch on the subject at this time.

It is also going to be the source of significant amounts of revenue, globally, for both those vendors that produce the technologies and/or bring them together in complex configurations, and for the users that get to make the most of exploiting those technologies effectively. Numbers in the $trillions are now being predicted.

But those words, `exploit those technologies effectively', carry with them a heavily implied danger – getting it wrong. These three pieces are not, of themselves, AI Horror Stories, but they do collectively balance the financial potential against risk areas, particularly the law of unintended consequences, and the need to be much more than just `prepared'. You may plan an AI system to do X, but can you be sure it won't also do Y or Z inadvertently? Can you be sure that it won't interact with AI system A to produce B and Z working against each other simultaneously? This is the alter ego of the Rumsfeld Question: `Do you know that you can know?'

So the advice carried here is: make sure you know as much as possible, seek as much advice as possible, and go and play with AI with some experts. There is certainly gold in them there hills, but there is also a fair degree of excremental misadventure for the unwary.

Sitting in The Dock of AI

Just another (AI) brick in the wall

Data + cloud + AI = $1 trillion

(7) To the edge, and beyond

Such storage developments bring, right out to the edge, fast, local access to data for the types of data-intensive applications that are currently the preserve of the traditional data centre. Indeed, as pointed out by Intel's Alex Quach in his recent discussion with me on the relationship between the coming of 5G mobile comms and the development of distributed, virtualised data centers, the biggest problem enterprises now have is their inability to shift vast volumes of data back to a singular, physical data center at speeds which make managing operational and business processes remotely practical.

You could get little shacks hosted at the edge of the network where you have big servers but require them in a small form factor and consuming low power. So we are doing more integration around the capabilities that are in a data center, where you start integrating I/O, storage and compute into smaller packages.

Why? Developments in semiconductor technology are about as esoteric as it is possible to get, and normally well off the radar for diginomica. But some recent developments are key parts of a wider shift that is set to change the face of the data center forever. Sometimes it is worth keeping an eye on what they are doing with the atoms in the nanometre world, for that will sometimes rattle cages in many a CIO's office.

Developments in both storage devices and the chips that go to support the functioning of 5G mobile communications are together creating an environment where putting powerful compute resources out where data is generated is not only possible but also makes a great deal of sense. The edge of the network is about to become as much a part of the corporate data center as the big, power-consumptive central edifice.

The reasoning is simple: if it is possible to process data where it is created, to the point of conducting complex analysis on it, then it will be faster and more manageable than shipping vast gobs of it back to the centre, which takes both time and energy. All that needs to come back to the center are the results of that work, with the occasional instruction on what needs to happen next making the return journey – and then only when a change is necessary; the local systems will normally know what to do next anyway.
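
A minimal sketch of that division of labour, with invented readings and an assumed alarm threshold: the edge node analyses the raw samples locally, and only a compact summary makes the return journey to the center:

```python
import statistics

def summarise_at_edge(samples, alarm_threshold=95.0):
    """Run the analysis next to where the data is generated and
    return only a small result payload for the central data center."""
    return {
        "count": len(samples),
        "mean": statistics.mean(samples),
        "max": max(samples),
        "alarm": max(samples) > alarm_threshold,
    }

# The raw readings stay local to the edge node...
raw_samples = [72.4, 88.1, 91.3, 96.7, 84.0]

# ...and only this tiny summary is shipped back to the centre.
payload = summarise_at_edge(raw_samples)
print(payload)  # {'count': 5, 'mean': 86.5, 'max': 96.7, 'alarm': True}
```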

The technology can now produce storage units that can slip into a pocket yet hold multiple Terabytes of data. Handheld edge compute nodes are already starting to become available that can sit right out at the farthest edges of a corporate network, yet are capable of performing all but the biggest and most intensive analytical tasks. Communication will start to come, from next year on, from the increasing use of 5G mobile comms technology, which will be able to provide extensive, high-bandwidth datacoms from any point in the network to any other. In practice, this will arrive as cloud services, which will become the heart and lifeblood of dispersed, distributed and virtualised data centers.

Fancy dismembering your data center and throwing it to the four winds?

Intel starts talking real 5G data center use cases

(8) Outsourcing and growing ecosystems

Outsourcing has been tried before, though more for budgetary reasons than real business development. Now, with the cloud connecting commodity hardware of prodigious power to run applications offering significant productivity potential, all built around a growing raft of common standards, the time does seem to be right for the next step – where enterprises really start to exploit the resources available within and through the main public cloud service providers and forget about being the providers of the service.

Why? The acquisition, earlier this year, of the data center operations of US oil giant, Chevron, by Microsoft Azure is seen by some in the cloud services business as the first marker of what could be a major change in the way that business users view the cloud and how it is used.

This five-year deal is, in many respects, a resurrection of the old business model of outsourcing. But as the two leading lights of hyperconverged systems maker Nutanix – SVP of Engineering and Product Management Sunil Potti and CEO Dheeraj Pandey – point out in this story, the real story runs deeper.

Potti in particular sees this as the next obvious development of the sudden realisation, amongst both the user community and many of the established vendors, that hybrid cloud services are not an option, but the only reality. From this it follows that, so long as data and applications have `freedom of movement' between any appropriate environment or resource that is part of a user's overall network, they should be able to do just that, dynamically and seamlessly.

Potti suspects that the next question such businesses will ask is along the lines of: `why should I have to manage this stuff when my cloud service providers are already experts at that?' And so the reincarnation of outsourcing will get under way – indeed, it is under way already.

Outsourcing is dead… long live outsourcing

(9) Alibaba – the Chinese are coming

But Yeming Wang is keen to point out that its operations have far less to do with the technology in play and much more to do with the business services it can provide. He sees the first customers for the UK operation being businesses looking to exploit the direct connection possible between China and the UK. So he expects this to cover both UK companies with existing operations in China, or looking to create them, and Chinese companies looking to trade in the opposite direction.

This is one aspect of an underlying trend across Alibaba. Its core market has been, as with its key rival, Amazon, providing a comprehensive, all-pervasive retail platform. But one perceptible difference between them is that Alibaba is more upfront about its understanding that the beginning and end of its business is about the accumulation, deployment, analysis and monetisation of data. It does not see itself as a `technology’ company per se.

Why? Mention public cloud services and some names inevitably leap to the front – Amazon AWS, Google, Microsoft Azure – but the `big gorilla’ of the Far East, Alibaba, is now over here, and seriously pitching at making its own space at that top table. And it aims to do that with a subtle but potentially important and different sales pitch; a pitch that might appeal to the all-important small and medium-sized business sector.

The company is seen here as primarily a cloud services provider, but that cloud is just the deliverer of a range of business services that, when taken together, can provide any business, from the global enterprise to the corner store, with all the IT resources they require to run their business – sales management, purchase management, banking services, logistics services and, for the growing number of small businesses entering the world of entertainment, a division commonly referred to as `Fun'. The business model therefore is to re-use (and therefore re-sell) as much of the data it generates as possible. For example, a business makes a sale, and the data is re-used to trigger stock replenishment, to drive the banking services that manage the transfer of funds to and from the business, and to prime the preferred logistics provider for an upcoming delivery.
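
A hedged sketch of that re-use pattern – all service names and event fields here are invented for illustration: one sale event fans out to stock, banking and logistics handlers, so the same data is put to work several times over:

```python
from typing import Callable, Dict, List

# A minimal publish/subscribe fan-out: each subscriber re-uses the
# same sale event for a different downstream business service.
handlers: List[Callable[[Dict], None]] = []

def subscribe(handler: Callable[[Dict], None]) -> None:
    handlers.append(handler)

def publish(event: Dict) -> None:
    """Hand one piece of data to every subscriber that can re-use it."""
    for handler in handlers:
        handler(event)

subscribe(lambda e: print(f"stock: replenish {e['qty']} x {e['sku']}"))
subscribe(lambda e: print(f"bank: transfer {e['total']} for order {e['order_id']}"))
subscribe(lambda e: print(f"logistics: schedule delivery for order {e['order_id']}"))

# One sale, three re-uses of the same data.
publish({"order_id": "A123", "sku": "tea-001", "qty": 3, "total": 12.0})
```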

Alibaba also plans to make extensive use of channel partners to provide the broadest coverage across all market sectors, and will be using them to target the everyday needs of customers by selling services in common business terminology – for example, specifying what resources are required as `sufficient resource to stock/sell/transact/deliver N quantities of products A, B and C to Y customers per day/week/month.'

Global-bound Alibaba partners with SAP on pushing HANA further into the cloud

Alibaba Cloud expands in Europe as it aims to change the rules

(10) Using tech to tackle human trafficking crisis

I’d like to start pushing the fight further into the private space. That’s the financial sector, transportation, hotels, that sort of thing, the people who are making decisions to stop trafficking there. We need to give them the tools they need in order to do that. And we have a few tools in the pipeline right now to help do that.

Why? At one level the answer to that question is self-evident. Human trafficking is an appalling stain on the record of humanity, especially when the nations doing it – whether as trafficker or traffickee – would lay any claim to being even remotely civilised. And that goes for the UK as much as most other countries.

But there are some other aspects to it now, aspects that might just bring significant levels of hope to those fighting trafficking on the front line, such as the subject of this story, the Global Emancipation Network. They want to make as much use of the latest technology as possible in order to identify the traffickers while they are in action. Technologies such as facial recognition tools can not only ID those being trafficked, but also those conducting the `business'. AI and big data analytics can be used to ID not only the individual traffickers but also how they operate, where the money goes and a wide range of other evidential issues.

That is why the Network is always on the lookout for sponsors/partners/helpers that can provide the technology resources required, and others that can provide the time to get the work done. Four of the biggest signed up to help are Splunk, for its system log analysis tools, Microsoft for its Azure cloud services, GitHub to secure its rather sensitive codebase, and Dark Owl for its capabilities in penetrating the recesses of the Dark Web.

Now the Network is looking for more, and different, partners, particularly across Europe where a goodly percentage of the trafficking takes place, especially among the banks that fund and trade the money involved, and the businesses – especially the hotel, restaurant, entertainment, agriculture and cleaning services trades – that employ the trafficked individuals.

The technology now exists to choke off the trade, as well as identify the perpetrators. Maybe it IS the place for every IT vendor to offer something – be it product, technology or brainpower.

Global Emancipation Network tackles human trafficking crisis with Splunk