
Repost: Avere Systems is Acquired by Microsoft

Big congratulations to Avere Systems, who recently announced their acquisition by Microsoft. Avere is a company I’ve followed since its inception in 2008. Their technology started as an edge NAS filer, extending data centre content to branch offices. Over time, the platform became virtual and moved to the public cloud. Today, vFXT (the cloud instance of the filer) enables customers to present their data to cloud resources and take advantage of public cloud analytics and processing, without having to commit to a wholesale migration of data to public cloud storage.

There’s an interesting blog post over at architecting.it that shows Microsoft’s commitment to developing the hybrid cloud and its applications. Check it out!



I hope you all had the best of holiday seasons, and I should be back again soon.

Tchau for now!



Looking Forward to 2018


2017 was the year of virtualization, and the days of buying dedicated hardware are coming to a close. I can only see this continuing, as the power of virtualization is clear for all to see: dedicated, task-oriented servers are relics of the past, being phased out in all sectors.

The resilience and IT agility that virtualization brings are undeniable, and it is being widely embraced.

Hardware is becoming more and more unified, with compute and storage solutions sold as commodities and virtualization providing the functional granularity and separation of servers and services.

Recent developments such as Scale Computing’s HC3 Unity will start to shift the paradigm from in-house private clouds towards hybrid cloud, allowing the public cloud to be leveraged to fill performance gaps in peak-load situations. For me this is a very exciting prospect, as it negates the need to over-provision compute resources: organizations can worry less about peak load than average load when buying compute hardware, knowing that their private cloud infrastructure integrates seamlessly with the public cloud. What IT department isn’t going to jump at the chance of spending less on hardware and gaining redundancy in the deal?
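To make the bursting idea concrete, here is a minimal, vendor-neutral sketch in Python of the capacity decision a hybrid cloud might make: size the private cluster for the average load and rent public cloud capacity only for the peaks. The capacity figure, threshold and function name are my own illustrative assumptions, not any particular product’s API.

```python
# Illustrative sketch only: split demand between a private cluster sized for
# average load and public cloud capacity rented just for the peaks.
# The capacity figure and threshold below are hypothetical placeholders.

PRIVATE_CAPACITY = 100   # compute units the on-premises cluster owns
BURST_THRESHOLD = 0.85   # start bursting once local utilization passes 85%

def plan_capacity(demand: float) -> dict:
    """Decide how much demand stays on-premises and how much bursts out."""
    local = min(demand, PRIVATE_CAPACITY * BURST_THRESHOLD)
    cloud_burst = max(0.0, demand - local)
    return {"local_units": local, "cloud_burst_units": cloud_burst}

if __name__ == "__main__":
    for demand in (40, 80, 130):   # a quiet day, a busy day, a peak event
        print(f"demand={demand:>3} -> {plan_capacity(demand)}")
```

The point is simply that the hardware budget only needs to cover what the cluster handles day to day; the peak is absorbed by temporary public cloud capacity and released afterwards.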

“Throughout 2017 we have seen many organizations focus on implementing a 100% cloud focused model and there has been a push for complete adoption of the cloud. There has been a debate around on-premises and cloud, especially when it comes to security, performance and availability, with arguments both for and against. But the reality is that the pendulum stops somewhere in the middle. In 2018 and beyond, the future is all about simplifying hybrid IT. The reality is it’s not on-premises versus the cloud. It’s on-premises and the cloud. Using hyperconverged solutions to support remote and branch locations and making the edge more intelligent, in conjunction with a hybrid cloud model, organizations will be able to support highly changing application environments” –said Jason Collier, co-founder at Scale Computing.

“This will be the year of making hybrid cloud a reality. In 2017, companies dipped their toes into a combination of managed service providers (MSP), public cloud and on-premises infrastructure. In 2018, organizations will look to leverage hybrid clouds for orchestration and data mobility to establish edge data centers that take advantage of MSPs with large bandwidth into public clouds, ultimately bringing mission critical data workloads closer to front-end applications that live in the public cloud. Companies will also leverage these capabilities to bring test or QA workloads to burst in a MSP or public cloud. Having the ability to move data back and forth from MSPs, public clouds and on-premises infrastructure will also enable companies to take advantage of the cost structure that hybrid cloud provides.” – Rob Strechay, SVP Product, Zerto


The big challenge for IT admins now is designing their virtual infrastructure around their organizational needs. For me there are two main challenges here: designing the virtual networks to optimize the performance of virtualized services, and ensuring the virtualized servers can communicate with their storage effectively, with I/O operations optimized for their needs across enterprise storage.

2018 must continue this trend, but it will bring its own challenges: for the hybrid cloud to offer truly seamless integration of private and public clouds, performance-critical services such as databases will remain hosted predominantly in-house, on storage optimized for them.

“From a storage perspective, I think what will surprise many is that in 2018 we will see the majority of organizations move away from convergence and instead focus on working with specialist vendors to get the expertise they need. The cloud will be a big part of this, especially as we’re going to see a major shift in public cloud adoption. I believe public cloud implementation has reached a peak, and we will even see a retreat from the public cloud due to hidden costs coming to light and the availability, management and security concerns.”  – Gary Watson, Founder and CTO of Nexsan

“2018 will be the year that DR moves from being a secondary issue to a primary focus. The last 12 months have seen mother nature throw numerous natural disasters at us, which has magnified the need for a formal DR strategy. The challenge is that organizations are struggling to find DR solutions that work simply at scale. It’s become somewhat of the white whale to achieve, but there are platforms that are designed to scale and protect workloads wherever they are – on-premises or in the public cloud.” – Chris Colotti, Field CTO at Tintri

Disaster recovery is another great benefit of the hybrid cloud: you can have an exact replica of your in-house cloud in the public cloud, ready to fail over at a moment’s notice. Natural disasters need not take down public-facing services, and they must be taken seriously in this ever-changing world…
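As a rough illustration of the failover idea (and not how any particular replication product implements it), here is a minimal Python sketch of a heartbeat loop that promotes the public cloud replica after several consecutive failed health checks. The URL, the intervals and the promote_cloud_replica() step are hypothetical placeholders.

```python
# Illustrative sketch only: watch the primary site and fail over to the
# public cloud replica after several consecutive failed health checks.
# The hostname, intervals and promotion step are hypothetical placeholders.

import time
import urllib.request

PRIMARY_HEALTH_URL = "https://primary.example.com/health"  # hypothetical
CHECK_INTERVAL_S = 30
FAILURES_BEFORE_FAILOVER = 3

def primary_is_healthy() -> bool:
    try:
        with urllib.request.urlopen(PRIMARY_HEALTH_URL, timeout=5) as resp:
            return resp.status == 200
    except OSError:
        return False

def promote_cloud_replica() -> None:
    # Placeholder: repoint DNS or the load balancer at the cloud replica
    # and bring its services up in the right order.
    print("Primary site unreachable; failing over to the cloud replica.")

def watch() -> None:
    failures = 0
    while True:
        failures = 0 if primary_is_healthy() else failures + 1
        if failures >= FAILURES_BEFORE_FAILOVER:
            promote_cloud_replica()
            return
        time.sleep(CHECK_INTERVAL_S)

if __name__ == "__main__":
    watch()
```

In practice the replication products quoted above handle the detection, ordering and promotion for you; the sketch just shows why keeping a warm replica in the public cloud makes the failover decision so simple.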


It’s 2018 and computers are fundamental to the core of business worldwide; there’s no debate, and it has been so for a long time. But that doesn’t necessarily mean it’s being done in the best way…

“In 2018, we will continue to hear a lot about companies taking on this journey called ‘digital transformation.’ The part that we are going to have to start grappling with is that there is no metric for knowing when we get there – when a company is digitally transformed. The reality is that it is a process, with phases and gates – just like software development itself. A parallel that is extremely relevant. In fact, digital transformation in 2017, and looking ahead to 2018, is all about software. The ‘bigger, better, faster, cheaper’ notion of the 90s, largely focused around hardware, is gone. Hardware is commoditized, disposable and simply an access point to software. The focus is squarely on software and unlimited data storage to push us forward. Now the pressure is on the companies building software to continue to lead the way and push us forward.” – Bob Davis, CMO at Plutora

“The total volume of traffic generated by IoT is expected to reach 600 Zettabytes by 2020, that’s 275X more traffic than is projected to be generated by end users in private datacenter applications. With such a deluge of traffic traversing WANs and being processed and stored in the cloud, edge computing — in all of its forms — will emerge as an essential part of the WAN edge in 2018.” – Todd Krautkremer, CMO, Cradlepoint

“As enterprise data continues to accumulate at an exponential rate, large organizations are finally getting a handle on how to collect, store, access and analyze it. While that’s a tremendous achievement, simply gathering and reporting on data is only half of the battle in the ultimate goal of unleashing that data as a transformative business force. Artificial intelligence and machine learning capabilities will make 2018 the year when enterprises officially enter the ‘information-driven’ era, an era in which actionable information and insights are provided to individual employees for specific tasks where and when they need it. This transformation will finally fulfill the promise of the information-driven enterprise, allowing organizations and employees to achieve unprecedented efficiency and innovation.” – Scott Parker, Senior Product Manager, Sinequa

Now, my PhD was in molecular informatics, and our research group was dedicated to finding novel applications for computing, processing data in order to extract principal factors. To me this has strong parallels with the concept of digital transformation, and it will be something I follow closely this year.

Happy 2018 to you all,

Thanks for reading

Tchau for now…


It Begins…


Phew, now that’s out of the way. I was expecting writer’s block, but hey, no!


Who is this fool, you ask?
Well, let me introduce myself. My name is Phil Marsden, and I have been invited by @rogerlund to start blogging my experiences, or at least my experiences in the world of cloud computing and virtualization. To that end, I think it’s only fair that you should be familiar with the guy currently occupying your eyeballs. I met Roger through a shared love of photography, and it is true that photography has become somewhat of a passion of mine, but that is unimportant for now.

What is important is that I love technology and what technology can do for us. I could use a photography analogy here: old-school photographers shot film and developed it themselves in a darkroom, patiently waiting for the developing and fixing chemicals to fix the image. They would then take that fragile negative and lovingly enlarge and process a print; even in the most professional of labs, “instant news” could take an hour or more. Today we shoot digital, get a 15+ MB RAW data dump from every shot, and develop it in digital darkrooms in a matter of minutes. Not only can we replicate the old workflow much faster, technology allows us to go much further than ever before.

Part of my love for technology comes from my background. I’m nearly 40, and I grew up in a nice southern English market town, where I was always a bit of a nerd at school. I never enjoyed school very much, but I did well at it, and at the ripe old age of 18 I shipped off to university. I studied chemistry, which again was fascinating, and I did well at it, understanding and growing in the process, but nothing really grabbed me and shouted “this is what you want to do!”
In the final year of my degree we had a module called computer-aided drug design, and I was hooked: hooked by the concept of a computer being able to model a chemical process. By calculating a chemical’s properties and its interactions with other molecules, we could virtually screen compounds in silico, or design the perfect compound for a drug to act on a specific target. Now, this process wasn’t perfect, but in a matter of hours, days or weeks we could come up with something that, by traditional methods, would have taken years and dozens of chemists synthesizing and testing compounds. In short, we could get faster and better, thanks to computers.

I was so hooked that it took me on to do my PhD at Cambridge in molecular informatics, where my project was very cool and I got to play with stereo 3D visualization long before 3D was even in the cinema. Even in the sea of intellect that was the theoretical and computational chemistry department at the University of Cambridge, I was still the nerd when it came to computers. I gamed, and as a poor student I built my own computers; I frequently overclocked them too far and broke components, so I was always fixing something. That carried over into my professional life, and I was frequently the person asked to help out when workstations did funny things. After finally writing up my PhD I had nothing planned, and the head of the group asked me if I wanted to stay on as a computer officer (now we are getting somewhere). Very quickly I realised it is the challenge of learning things, coupled with the use of technology, that gets me going!

When I joined the group, IT was basically firefighting. All users had root or admin on their workstations, and frankly it was a nightmare. There were network policies that defined whether personal machines were allowed on the network, and this department of 2,500 registered machines and around 10,000 registered users had a team of seven computer officers managing everything: OS installs, upgrades, application installs, machine meltdowns, network infrastructure maintenance and socket patching. It was all firefighting; there was no manpower for developing departmental infrastructure.
Now, I came on as an assistant to the computer officer in the Unilever Centre (hello Charlotte!), and Charlotte, to my eternal gratitude, instead of using me as a personal slave in a very hectic work environment (which, for future reference, contained a “training area” of 25 identical workstations for hosting events and workshops), decided she wanted me to manage the training area. The training area was frequently reinstalled, with various packages set up for particular workshops. Indeed, one of my first tasks was to install Office 2000 on each machine; amazingly we only had one CD, and it involved going to each machine in turn, powering up, logging on, inserting the disk and installing Office – 25 times!

Now, I’m not the brightest, but even to me it seemed there must be a better way. Charlotte sat me down and said she wanted me to go on a training course for Active Directory, as she had done one “a while ago” and was “fairly sure it had the answer”. And boy was she right!

The next couple of years of my life were spent implementing an AD infrastructure in our little sub-department. Our carrot to entice users into letting us manage their machines was a little file server I knocked together out of recycled bits kicking around; IIRC it had a 6 TB RAID 6 array, on which I gave each user space mapped as a network drive, along with an assurance that three disks would have to die simultaneously for their data to be lost.
Over time this grew, and when Charlotte left to follow her passion for marine biology and her job became available, I was successfully recruited, and the domain slowly grew, research group by research group. The domain only grew thanks to a cohesive strategy emerging within the department, which resulted in departmental resources that we spent on servers, partly because the domain had proved its worth in terms of network security, machine maintenance and the resiliency of user data, and so now justified departmental investment. When we first set up the domain we used old workstations, probably 1 GHz Celerons, as domain controllers, and made sure we had enough of them for replication to keep us safe.

Now, with investment, we moved on to virtualized servers, where each virtual server was hosted on a pair of physical servers with network-mirrored hard drives, automatic failover and so on. System uptimes from that point on were just great: 99.9% and upwards, which works out to less than nine hours of downtime a year.

Now, whilst at the university I met a lovely Brazilian doctor; we fell in love and got married, and in time my eldest son was born. Circumstances led us down a path which resulted in us moving to Brazil and my becoming a stay-at-home dad.

Fast forward seven years: I was talking to my family at lunch about the opportunity to start blogging, and my son ended up asking, “What is cloud computing?”

That’s pretty much where I am. I was a sysadmin seven years ago, I’m familiar with the concept of virtualization, and I’ve run virtual machines in the real world. I’m a bit out of touch at the moment, but I am now at a point in my children’s lives where I have more time to develop, and a brain that still wants to learn.
I eagerly accept Roger’s invitation to learn about the VMware platform and blog about my experience.

Thanks Roger!



Uila – A view into your data center.

I recently had the chance to get a product overview of Uila.


What product does Uila have?


“Uila’s Application-Centric Infrastructure Monitoring helps align business and IT Operations goals in a single product by providing IT Operations with the application visibility and correlated network, compute and storage insights for Private, Public and Hybrid Cloud-based Data Centers (such as VMware, Amazon AWS, Microsoft Azure, Google Cloud, Docker Container, etc.).”


Ever need to see why your developers or application owners are saying that your virtual environment is causing performance issues for their application? Of course, it’s not VMware, but it could be something between the virtualization layer and the application layer causing the problem.





Sounds great. But what does that mean? How does it work?





How do I see into my environment?




Uila shows both real time and back in time, so you can click and dig in.



Then you can dig into Application Analysis to see what is causing the problem.



Once you find the problem (in this case Oracle_11g-n1 looks suspect), you can dig further in.




Of course you can do a root cause view.


And find the problem.





For more information on Uila:




Tech Field Day Presentation by Uila – Company Overview

Tech Field Day Presentation by Uila – Application Visibility and Root Cause

Tech Field Day Presentation by Uila – Infrastructure Monitoring

Tech Field Day Presentation by Uila – End User Experience Monitoring

Tech Field Day Presentation by Uila – Virtual Network Monitoring


Data Sheet


Check them out.




Roger Lund

NetApp Insight 2016 – General Session – Day 1

Today I am joining you live from NetApp Insight 2016.

This is my best attempt to keep up with the keynote as it happens; pardon any missed content.
Roger Anderson joins us on stage after a nice media presentation: “Who said data wasn’t exciting?”

He talks about the value to both customers and partners, and how they have made this event something special.

And welcomes us to #netappinsight 2016.

The conversation turns to the event, how it is a NetApp milestone, and how, based on feedback, there are now fewer general sessions and more breakout sessions and labs than ever before.

Roger introduces George Kurian, who talks about digital transformation and how data is the currency of the digital economy.

He goes on to talk about NetApp’s success in data management over the last 20 years.

He brings up simplifying and modernizing IT infrastructure, scale, and accelerating business performance. Examples are SSD, scale-out, and SolidFire.

He talks about data silos, how they hinder the data center, and how the competition focuses on technology that hinders transformation. NetApp’s Data Fabric enables IT’s digital transformation.

He goes into transforming NetApp to serve you in the new era of IT, serving more customers and devices with expanding technology innovation.

Next, George addresses how customers want to buy and consume solutions, by embracing the partner ecosystem.

He indicates that NetApp is the fastest-growing SAN vendor.

He talks about how more and more customers and service providers use the cloud because of NetApp’s cloud strategies.

He introduces the 2016 Innovation Award finalists. Mercy Technology Services takes the stage and talks about the success they have had with NetApp and SSD.

He leaves us with one thought: challenge us to make us better. And welcome to NetApp Insight.

George introduces Joel Reich. Joel talks about yesterday’s data management and how it has changed in the last 50 years.

He talks about data and the power of it: data is the new oil, and the NetApp Data Fabric is the new bank.

The solution to cashing in on your new data is the Data Fabric.

Joel introduces the new all-flash FAS models, the AFF A700 and AFF A300, plus the new hybrid models: FAS9000, FAS8200 and FAS2600.

Joel shows off the NetApp flash platform, with EF, AFF and SolidFire. Next they play a video on SolidFire.

Joel talks about OnCommand Insight and the power of analytics.

Next they close with a demo of SnapCenter and SnapMirror, with AltaVault to the cloud. They back up to StorageGRID and then to Azure, and then show how they can use the Data Fabric to restore files from Amazon to your data center.

Next they talk about data in space, NASA, and the JPL data center. They talk about how they use data transformation from file to object and back, with security. They use Cloud Sync and do a demo: it converts file to object, and back from object to file.

Cloud Sync is on the Amazon Marketplace.


Now Henri Richard takes the stage.

He talks about partnerships and introduces Samsung. Samsung talks about trends and the 2.2 connections for each human on the planet, and the massive amount of data these devices are creating.

Samsung talks about 3D NAND, 16 TB SSDs, and the newly announced 32 TB SSDs. They also talk about Z-NAND and the Z-SSD.

They cover the NetApp AFF8020, multi-stream writes, and custom firmware for improved NAND endurance.

He closes by talking about the history of Samsung and NetApp innovation.


Next up is Cisco. They play a video featuring FlexPod. Cisco takes the stage and talks about the partnership and FlexPod: how they have 7,300 FlexPod customers, 1,100 partners, and more.

Next they discuss digital transformation and the changes driving IT. They detail how application architectures, operational models and consumption models need to change, and how we are living those shifts today.

Next, Cisco talks about their data center strategy and how they have been making strides with a programmable infrastructure and a policy-driven platform. They talk about SolidFire and FlexPod, and overall customer simplicity.

Next up is Fujitsu. They talk about human-centric innovation.







Roger Lund

Oracle Buys Ravello Systems

Big news for anyone following Ravello Systems: Ravello has agreed to be acquired by Oracle.


from https://www.ravellosystems.com/blog/oracle-buys-ravello-systems/

I am thrilled to share that Ravello Systems has entered into an agreement to be acquired by Oracle. The proposed transaction is subject to customary closing conditions. Upon closing of the transaction, our team will join the Oracle Public Cloud (OPC) organization and our products will become part of Oracle Cloud. We believe this agreement will accelerate our ability to reach more customers, deliver more value, and enhance our technology at an accelerated pace in order to better serve you.

Thank you for your continued support. I want to emphasize that our top priority is ensuring an uninterrupted service and seamless experience for you and all of our customers and partners. Rest assured, Ravello’s service will continue “as is.” In the coming months, we will be working to continue enhancing our value to you and we are looking forward to developing new products and services enabled by this combination.

Oracle Cloud offers best-in-class services across a full suite of products in software as a service (SaaS), platform as a service (PaaS), and infrastructure as a service (IaaS). Ravello will join in Oracle’s IaaS mission to allow customers to run any type of workload in the cloud, accelerating Oracle’s ability to help customers quickly and simply move complex applications to the cloud without costly and time-consuming application rewrites.

Please do not hesitate to reach out to me or anyone at Ravello if you have any specific questions and corporate information can be found at http://www.oracle.com/ravellosystems.

Thank you,

Rami Tamir
Ravello Systems

Huge news indeed. I wish the entire team at Ravello the best of luck at Oracle.

No news yet on how this will impact Ravello’s support of VMware products on its cloud offering.



Roger Lund