Home Datacenter Predictions for 2019, Part 2


by Phil Marsden

Part 2, continued from Part 1.

AI/Machine Learning:

Alan Conboy, Office of the CTO, Scale Computing

“In 2019, Artificial Intelligence (AI) and Machine Learning (ML) will come close to reaching their full potential by connecting and processing data faster over a global distribution of edge computing platforms. AI and ML insights have always been available, but they have often been delivered more slowly than needed over cloud platforms or traditional data centers. Now we can move compute and storage capabilities closer to where data is retrieved and processed, enabling companies, organizations and government agencies to make wiser and faster decisions.

We’re already seeing this in the way airlines build and service airplanes, government defense agencies respond to hackers and how personal assistants make recommendations for future online purchases. This year, thanks to AI and ML, someone will finally know if that special someone really wants a fruitcake or power washer.” – Alan Conboy, Office of the CTO, Scale Computing

Scott Parker, director of product marketing, Sinequa

“While there has been so much hype around machine learning and AI, the biggest surprise from 2018 is how much of it is still being tested in the lab. There is an abundance of marketing around the related technologies, but, with a few exceptions, AI projects remain in the lab waiting for compelling use cases to come along.” – Scott Parker, director of product marketing, Sinequa

Setu Kulkarni, vice president of corporate strategy, WhiteHat Security

“As AI and ML become mainstream, a new breed of security data scientists will emerge in 2019. AI and ML techniques are data dependent. Preparing, processing, and interpreting data require data scientists to be polymaths: they need to know computer science and data science, and above all, they need domain expertise to be able to tell bad data from good data and bad results from good results. What we have already begun seeing is the need for security experts who understand data science and computer science to be able to first make sense of the security data available to us today. Once this data is prepared, processed and interpreted, it can then be used by AI and ML techniques to automate security in real time.” – Setu Kulkarni, vice president of corporate strategy, WhiteHat Security

Bob Davis, CMO, Plutora

“In software development, the big story in 2019 will be machine learning and AI. In the coming year, the quality of software will be as much about what machine learning and AI can accomplish as anything else. In the past, delivery processes have been designed to be lean and reduce or eliminate waste but to me, that’s an outdated, glass-half-empty way of viewing the process. This year, if we want to fully leverage these two technologies, we need to understand that the opposite of waste is value and take the glass-half-full view that becoming more efficient means increasing value, rather than reducing waste.

Once that viewpoint becomes ingrained in our M.O., we’ll be able to set our sights on getting better through continuous improvement, being quicker to react and anticipating customers’ needs. As we further integrate and take advantage of machine learning and AI, however, we’ll realize that improving value requires predictive analytics. Predictive analytics allow simulations of the delivery pipeline based on the parameters and options available, so you don’t have to thrash the organization to find the path to improvement. You’ll be able to improve virtually, learn lessons through simulations and, when ready, implement new releases that you can be confident will work.

In 2019, progressive organizations will be proactive through simulation. If they can simulate improvements to the pipeline, they will continuously improve faster.” – Bob Davis, CMO, Plutora

Edge Computing:

Alan Conboy, Office of the CTO, Scale Computing

“According to Statista, the global IoT market will explode from $2.9 trillion in 2014 to $8.9 trillion in 2020. That means companies will be collecting data and insights from nearly everything we touch from the moment we wake up, and likely even while we sleep. In 2019, edge computing will require a new level of intelligence and automation to make those platforms practical. Where once only a smidge of data was created and processed outside a traditional data center, we will soon be at a stage where nearly every piece of data is generated far outside the data center. This amount of data will create a permanent home for edge computing.” – Alan Conboy, Office of the CTO, Scale Computing

Bill Miller, CEO at Axellio

“Next year, edge computing will drive companies to re-invent themselves to take advantage of this growing market. The demand to capture and analyze data at the source will drive strong growth in the hybrid cloud market, including cloud services like Microsoft Azure and Amazon AWS, which are working to make their stacks available to run through on-premises micro data centers. We know that these workloads don’t run well in the cloud, so companies like Microsoft and Amazon will likely adapt their cloud services to run on the edge.

We can also expect to see cable and telco companies reinvent how they use their switching infrastructure to join the edge-computing market. Telcos have thousands of headends used to process television signals, which can be repurposed to create low-latency edge-computing services for their customers. Between the cloud providers and these cable companies and telcos, we could see some pretty interesting business moves to adapt to the edge-computing market.” – Bill Miller, Axellio CEO


Neil Barton, CTO, WhereScape

“Over the next year, the number of companies harnessing IoT capabilities in a broader range of contexts will continue to expand rapidly, thanks to technological advancements producing sensors that are smaller, cheaper and more effective. The challenge is no longer in the tech, but in the value organizations can extract from the data they collect, which can be like finding a needle in the data haystack. Now, only a year away from Gartner’s prediction that there will be 20 billion internet-connected things online by 2020, there will be a data revolution as IT teams look to incorporate these new data sources into existing analytics environments and make insights quickly and easily accessible to the business.

With this sheer volume of connected devices set to become the norm in 2019, the only realistic solution is to embrace automation to ingest, transform and deliver real-time data and insight in a way that the business can use. Data automation can ensure that IT teams, whether old or new to working with streaming data, can absorb the astronomical volume of data on the horizon and be in a position to leverage its insights quickly,” said Neil Barton, CTO at WhereScape.

Stephen Gailey, solutions architect, Exabeam

“2019 will see a continuation and probable acceleration of the IoT trend, with more and more devices becoming smart and gaining connectivity. This will inevitably bring more hacks and more botnets built from previously inert devices. AI turning on humankind is likely to be led by fridges and smart doorbells rather than by Cyberdyne Systems Hunter Killers!” – Stephen Gailey, solutions architect, Exabeam


Jeff Keyes, director of product development, Plutora

“2019 will be the year of up-leveling in the development process so that teams and organizations can maximize the efficiencies and benefits of their value stream. It’s not about tools – that battle is over. We’re concerned now about what comes after the tools are in place. The technologies we have at our disposal are highly advanced and perform cutting-edge tasks. But they are still tools and anyone can buy a tool, take it out of the box, put it to work and get some level of results. Where we can really make strides going forward is efficiently delivering the most value to the customer and clearly demonstrating the value of managing the people, processes and technologies of software delivery pipelines.

The goal for all organizations in the year ahead should be understanding how to demonstrate to customers the key steps in the process while also showing which steps actually offer any real value to the process. The business landscape will become increasingly competitive as organizations scramble to work faster and cope with growing customer demands. However, with speed often comes wasted time and efforts caused by haste. Organizations will have to learn to take a step back, look at the big picture and identify areas for improvement as well as those not integral to the process.

Also, any time you solve a problem, don’t think it’s solved for good. Organizations will solve and overcome problems in 2019, but customers’ needs and expectations will also evolve, so the management of the value stream will have to as well.” – Jeff Keyes, director of product development, Plutora

Claire Giordano, a VP at Citus Data

“For 2019, I see a year full of Kubernetes. Kubernetes is going to make all things DevOps much easier. The Kubernetes ecosystem is inching towards a world of more robust database and storage offerings. Next year, I believe we will see numerous benefits from Kubernetes, but most importantly, we will have improved access as well as security controls.” – Claire Giordano, a VP at Citus Data


Ziv Kedem, CEO, Zerto

“In 2018, we saw organizations dreaming about making new services ‘born in the cloud,’ but facing challenges getting the data there with minimal disruption. In 2019, we’ll see more businesses begin to make this dream a reality.

Protecting workloads once they’re in a public or multi-cloud environment can often be a big challenge that people just don’t think about. It’s easy to assume that once your data is out there, it’s already safe, and the platform won’t go down. However, this simply isn’t the case. Businesses need to be actively looking to protect and monitor data wherever it is, and understand what role mobility can play in this. As workload needs change, businesses will need to make sure they have a way to move data to, from and between different infrastructures without interruption, to ensure the organization is able to seamlessly adapt and protect itself.

It is important not to make downtime vulnerability an ‘acceptable’ risk, in order to focus on continuous development, speed and performance. We hope that, in 2019, more business leaders will opt for technologies that provide assurances that all workloads, data, and applications are protected no matter where they reside, where they are moved to or what outside influences impact them. Achieving this, without slowing the pace of business transformation, will be the great technology challenge that business leaders should look to overcome in 2019,” said Ziv Kedem, CEO, Zerto.


Exciting times! As the world becomes more connected and ever more reliant on data, 2019 will, I’m sure, bring many developments worth watching.
