The Shift Away From Managing Security by Compliance to Managing Security by Risk


By Kaus Phaltankar, Chief Sales Officer, Enterprise Risk Management & President, Virtustream Security Solutions

Every day, enterprises are faced with new and constantly evolving threat vectors. The bad actors have to get it right only once, while enterprises have to defend themselves continuously and get it right every time!

Enterprise Risk Management (ERM) has emerged as an important business trend that builds a holistic and integrated approach to risk management across the enterprise. ERM encompasses a number of risk areas, including Information Technology (IT) Risk, Operational Risk, Regulatory Compliance Risk, Financial Risk and Reputational Risk.

The ERM risk areas are all affected by the state of security within the enterprise. There is a paradigm shift within enterprises away from ‘managing security by compliance’ toward ‘managing security by risk.’ This moves the organization from managing security, Information Assurance (IA) and compliance through a discrete, snapshot-in-time ‘checklist-based’ approach to a real-time ‘continuous risk management’ approach. ERM provides stakeholders and decision makers a risk view across the enterprise, with detailed risk by department, bureau or information system. The risk view is available on demand, as well as over a period of time, to show how the information system owners and leaders are managing risks.

This risk-based approach requires quantification of the risks identified by the various security and compliance tools monitoring everything from hardware and software assets to business-critical applications. The National Institute of Standards and Technology (NIST) Risk Management Framework (RMF) defines point security technologies, such as Asset Management Systems, Configuration Management tools, Vulnerability Scanners, Security Information and Event Management (SIEM) systems, as well as Governance, Risk and Compliance (GRC) tools, as ‘Sensors.’ The Department of Homeland Security (DHS) Continuous Asset Evaluation, Situational Awareness and Risk Scoring (CAESARS) framework places these sensors in its ‘Sensor Sub-system,’ which collects data about enterprise asset risks; the data are then analyzed and quantified using risk scoring algorithms. The per-asset risks are aggregated by information system and by the mission-critical functions those information systems support. The critical emphasis is on continuous risk monitoring that automates machine-to-machine information exchange through standards-based protocols such as the NIST Security Content Automation Protocol (SCAP). The continuous nature of the analysis allows information system owners to assess risk on a near real-time basis and become proactive in mitigating risks through a prioritized response.
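As a sketch of the roll-up just described, the toy example below scores each asset from its sensor findings and aggregates the scores by information system. The severity weights, field names and score scale are assumptions for illustration only; they are not defined by NIST RMF or DHS CAESARS:

```python
from collections import defaultdict

# Illustrative severity weights per finding type; the values and field
# names are assumptions for this example, not NIST/DHS-defined scores.
WEIGHTS = {"vulnerability": 10.0, "config_drift": 5.0, "missing_patch": 7.5}

def score_asset(findings):
    """Sum the weighted findings the sensors reported for one asset."""
    return sum(WEIGHTS.get(f["type"], 1.0) * f.get("count", 1) for f in findings)

def aggregate_by_system(assets):
    """Roll asset-level risk scores up to the owning information system."""
    totals = defaultdict(float)
    for asset in assets:
        totals[asset["system"]] += score_asset(asset["findings"])
    return dict(totals)

assets = [
    {"system": "HR-Payroll", "findings": [{"type": "vulnerability", "count": 3}]},
    {"system": "HR-Payroll", "findings": [{"type": "config_drift", "count": 2}]},
    {"system": "Web-Portal", "findings": [{"type": "missing_patch", "count": 1}]},
]
print(aggregate_by_system(assets))  # {'HR-Payroll': 40.0, 'Web-Portal': 7.5}
```

A production scoring engine would, of course, draw its weights from CVSS scores, baseline deviations and asset criticality rather than a hard-coded table.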

Regulatory requirements, such as GLBA, SOX, HIPAA, PCI or the Federal Information Security Management Act (FISMA), also mandate ‘continuous monitoring’ of information systems: protecting Personally Identifiable Information (PII) using Privacy Impact Assessments (PIA), as well as monitoring for unauthorized access to maintain the integrity of data and systems.

The biggest challenge for enterprises today is to truly understand what it means to conduct continuous risk monitoring and what that entails. The key requirements of continuous risk monitoring under RMF are:

  • Close tracking of change and configuration management of assets
  • Monitoring of Information Assurance (IA) and governance controls using automated tools
  • Quantifying risk based on risk algorithms and computations
  • Document creation, updates and reporting

Enterprises need a scalable data ingest, collection, storage and processing platform that delivers an accurate and timely view of IT and operational risks by providing a 360° view of each asset’s software and hardware inventory, vulnerabilities (VUL) and compliance with approved secure baselines. Other key attributes of a Continuous Risk Management platform are:

  1. Scalability: The continuous nature of data collection automatically imposes scalability requirements, which span data storage size, collection frequency and retention period. Additionally, the platform needs to demonstrate flexible data collection and scalable analysis. It must also be deployable stand-alone or set up in a tiered architecture to accommodate distributed enterprise implementations.
  2. Agnostic Sensor Coverage: The data ingest should be configurable to support multiple sensors based on current point technologies, as well as a multitude of data inputs such as NIST SCAP, XML, JSON, CSV and other formats.
  3. Creating a Common Operational View: The solution needs to provide a singular view of risk by asset, application, department, agency or the enterprise as a whole. This view needs to be on a single pane of glass with full drill-down and drill-back capability to show which data contributed to the overall risk at each level of the dashboard.
  4. Data Warehousing: The platform should provide options for data warehousing based on data volume, from SQL to a NoSQL Big Data solution. SQL-based RDBMS storage (e.g., Microsoft SQL Server, Oracle, DB2) provides a traditional store for structured data, while a NoSQL Hadoop solution offers a store for structured and unstructured data with linear storage and processing scalability using a multi-node cluster.
  5. Monitoring Management/Workflow Capabilities: Enterprises need a solution that controls all workflow requirements in terms of risk monitoring, compliance with baseline specifications, and response metrics and related mitigation steps. Built-in workflow managers should be customizable to map business processes, assign risk values and trigger alerts for mitigation actions.
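To illustrate the agnostic sensor coverage attribute, the sketch below normalizes JSON and CSV sensor feeds onto one common record schema. The field names and sample feeds are assumptions for the example, not a standard mapping:

```python
import csv
import io
import json

def normalize(record):
    """Map one raw sensor record onto a common schema (field names assumed)."""
    return {
        "asset_id": record.get("asset_id") or record.get("host"),
        "finding": record.get("finding") or record.get("title"),
        "severity": float(record.get("severity", 0)),
    }

def ingest(payload, fmt):
    """Accept JSON or CSV sensor output and return normalized records."""
    if fmt == "json":
        rows = json.loads(payload)
    elif fmt == "csv":
        rows = csv.DictReader(io.StringIO(payload))
    else:
        raise ValueError(f"unsupported sensor format: {fmt}")
    return [normalize(r) for r in rows]

json_feed = '[{"asset_id": "srv-01", "finding": "CVE-2014-0160", "severity": "9.8"}]'
csv_feed = "host,title,severity\nsrv-02,Weak TLS configuration,5.0\n"
print(ingest(json_feed, "json") + ingest(csv_feed, "csv"))
```

A real platform would add parsers for SCAP/XML result streams behind the same interface, which is what keeps the downstream risk scoring sensor-agnostic.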

At Virtustream, we offer our Analytics and Continuous Monitoring Engine (ACE) for Enterprise Risk Management, Cyber Situational Awareness and Regulatory Compliance monitoring on a continuous basis. The Continuous Risk and Compliance Monitoring (CRCM) capability provides:

  • Proactive risk management using a standards-based risk management framework
  • Continuous monitoring of each asset for compliance and risk by building a 360° view of each asset within the enterprise using a multitude of sensor data
  • Massive processing capability using Hadoop Big Data solutions to quickly process data of high volume, variety and velocity, whether structured or unstructured
  • Threat and impact analysis using external threat intelligence from US-CERT or commercial feeds, asset configuration policy and hardening guides such as Security Technical Implementation Guides (STIGs)
  • Business Intelligence (BI) analytics and reporting capability for analyzing massive amounts of data for diagnostics and prioritizing mitigations
  • Near-automated mitigation enabled by interfacing with other tools and technologies

The ACE Continuous Risk Monitoring approach truly enables Enterprise Risk Management and allows enterprises to shift away from ‘managing security by compliance to managing security by risk.’


Forecast: Cloudy With a Chance of Encryption


By Tom Evgey, Sr. Security Analyst, Virtustream

The RSA Conference in San Francisco, CA, is certainly one of, if not the, largest network security conferences in the world. Rows upon rows of vendors attend, showcasing the latest and hottest tools and services in the industry in an effort to convince you to buy their ultimate solution for protecting your most prized assets. Buzzwords like SIEM integration, log correlation, cloud security and forensic analysis were on almost every flyer; long lines of the security field’s finest crowded around in search of that one solution that will put all of us “enemy of the state” enthusiasts at ease.

Perhaps, as we each head back filled with promotional material and swag, we’ll take a closer look at some of the capabilities each of the vendors had to offer. And hopefully, we’ll find something we can bring back to our own environment and integrate into our own security mission in an effort to proactively prevent the next big hack.

Securing The Perimeter

The common strategy has long been protecting our assets from the outside in. We utilize firewalls, an Intrusion Detection System (IDS), a proxy, maybe a Web Application Firewall (WAF) and Anti-Virus (AV), all perimeter-based.

Perhaps, if we’re really paranoid, we install a host-based IDS. We run our weekly or monthly scans, we have a log correlation system to collect our logs, we have HTTPS running and OTP protecting ALL of our applications, and, every once in a while, we test our ‘Incident Response Plan’ to check our readiness. But is that really the correct approach? I don’t think anyone would argue that the companies that have been breached in the last few months were lacking any of the above methods. Still, we hear about these large-scale attacks, week after week, with a cloudy forecast for more. So what ARE we doing wrong?

Our Challenge

As a Cloud Service Provider, we have a unique challenge at Virtustream. We manage massive amounts of customer data. Our clients all use our processor power, disk I/O, network packets and memory resources, and their consumption is reported by the minute, based on need and environment, all while we maintain Confidentiality, Integrity and Availability (CIA) and stay current with every compliance framework. We need to find the right balance between protecting our customer data and Intellectual Property (IP) and providing accessibility to our user base… all at a very competitive price. It spans everything from a simple login to our web portal all the way to large-scale cloud bursting.

Our Solution

So, how do we protect our data? What is the missing component in our security strategy? How do we maintain the integrity of our data without losing that accessibility? If you guessed “getting another scanner or IDS with every bleeding-edge signature” or yet another firewall, you should reconsider (and maybe take another quick glance at the article’s title).

The answer is encryption. Encrypting our data ensures WE have the keys to our assets, no matter where the data goes.

Here at Virtustream, we leverage the capabilities of Vormetric Encryption. Not only is the data encrypted and protected from inbound threats; Vormetric also takes ‘access as needed’ to another level. As part of a policy, it can restrict any user from accessing the data by way of encryption. The prompt ‘access denied’ now has a new meaning.
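As a toy sketch of that idea, policy-gated access to encrypted data, the class below releases the key only to users named in the policy, so even someone holding the raw ciphertext gets ‘access denied.’ This is not Vormetric’s actual mechanism, and the XOR keystream is a deliberately weak stand-in for a real cipher:

```python
import secrets

class KeyGuard:
    """Toy policy-gated encryption: the key is only applied for users the
    policy allows, so even someone with the raw ciphertext is denied.
    The XOR keystream below is a stand-in for a real cipher; production
    systems use vetted cryptography (e.g., AES via a product such as
    Vormetric), never this."""

    def __init__(self, allowed_users):
        self.allowed = set(allowed_users)
        self.key = secrets.token_bytes(32)

    def _xor(self, data):
        # Repeat the key to cover the data length, then XOR byte-by-byte.
        stream = self.key * (len(data) // len(self.key) + 1)
        return bytes(a ^ b for a, b in zip(data, stream))

    def encrypt(self, data):
        return self._xor(data)

    def decrypt(self, user, blob):
        # The policy check happens at key release, not at the file system,
        # so 'access denied' holds no matter where the data has traveled.
        if user not in self.allowed:
            raise PermissionError(f"access denied for {user}")
        return self._xor(blob)

guard = KeyGuard(allowed_users={"dba_alice"})
blob = guard.encrypt(b"customer records")
print(guard.decrypt("dba_alice", blob))  # b'customer records'
```

The design point is that the gate sits in front of the key, not the file: a stolen disk or a compromised account outside the policy yields only ciphertext.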

Final words

Encryption of the data is a must in any environment. It will be the one tool that saves the day when all other tools have failed. It will save the day when the marketing manager accepts a ‘LinkedIn’ invite that redirects to an HTTPS site hosting malware. It will save the day again when someone in finance opens a PDF file containing embedded malicious code. All of these scenarios, as I’m sure you know, happen every day. Our weakest links are always internal. Who does port knocking anymore, or IDS evasion, or tries to bypass the firewall, when a simple, cheap email with a scandalous or believable subject line is all it takes to get cracked wide open? Risks mostly come from inside, and encryption can prevent access to data, even in the case of a breach.

We live in a fast evolving environment, where everything is changing rapidly and we try diligently to stay one step ahead of the attackers. The challenge is that we have to be right EVERY time, while they only have to be right ONCE.

So, my suggestion to you is to brighten up your cloud forecast with some encryption.  You’ll be glad you did!


Report from the Trenches of RSA 2014


By Pete Nicoletti, Chief Information Security Officer, CISO, CCSK, CISSP, CISA, CCNE, FCNSP

The planet’s smartest security minds have descended by the thousands on the Moscone Center here in San Francisco, desperately searching for their personal holy grail. They are looking through the myriad of security-related things: the next generation of something; a hot new startup that could be worth billions; new tools and stuff that is cheaper, better, faster; a new market segment, built on big data and, of course, cloud enabled. The vendors have spent millions. The restaurants, streets, halls, classes and hotels are packed with people who are all part of the huge ecosystem that protects our networks, our data, our intellectual property and our way of life. And, here in this hyper-nexus of thousands of solutions, there is still just one single really good “answer” to be found.

What is that answer? The “answer” was invented thousands of years ago to protect war secrets and has been improved upon over time to be virtually unbeatable. The “answer” should be the first thing that is researched, selected, budgeted for and deployed in every environment. It should be required by every compliance framework and verified to be correctly in place by every audit. The “answer” should be the one thing that everyone who uses a credit card, or sees a doctor, or makes a phone call, or sends a picture by email, or uses a critical company-provided application should simply assume is in place protecting them.

Simply assuming that this “answer” is in place is wrong and in some cases dangerous. We have counted on Target, Neiman Marcus, Michaels, LinkedIn and even the NSA to protect our information and secrets, and they have all failed us. They all knew of and had access to the “answer,” but in every case above (even in light of the thousands of other recent breaches) they didn’t deploy the “answer” correctly or completely.

We hear about security failures and breaches every day, but there is also an ever-increasing incidence of “non-events.” Companies, universities, vendors, web sites, banks and agencies are being breached, and every security tool they have in place is failing except one. And because this one tool does not fail, these breaches are classified as non-events and don’t have to be reported, don’t make the news, don’t cause embarrassment or end careers. They don’t even require restitution via credit monitoring protection or refunds. State breach laws give these organizations safe harbor, and there are no costs or consequences to face even though they (almost) utterly failed. These organizations have the “answer,” they deployed it correctly, and it did not fail them.

Things must change. We can’t allow the largest theft in the history of the human race to perpetuate as our intellectual property is stolen away through copper and fiber and sent west to China. We should collectively rebel and refuse to pay the billions of dollars of breach costs that we as consumers all have to absorb. We should vote with our wallets and only buy our goods and services from providers that know the “answer” and know how to deploy it correctly.

So, what is the “answer”? Encryption.

You will find this “answer” everywhere at RSA: Major chip makers like Intel are building in-silicon security functions to make the process easier, faster and have less overhead. Major ERP Vendors like SAP are investing in supporting strong encryption. Vendors like Vormetric are offering extensive solutions for encryption deployment and management. Cloud software and services provider Virtustream is baking encryption in and insisting clients bring their own or buy the service.

I personally didn’t learn the “answer” without first paying a price. It was only after responding to hundreds of breaches and managing security for thousands of companies (including having one of my own companies get hacked) that I figured it out. Now that I have, though, I take every opportunity to let everyone else in on it. Case in point: the license plate on my vehicle is ENCRYPT!

So you see, I am doing my part to spread the word on encryption and make the world a safer place; now it is your turn! Encryption done right can solve so many of our critical data protection challenges, save us trillions in thefts and billions in breach costs, and virtually eliminate our failures. So, when and how are you going to ENCRYPT?

For more info on security, check out this press release from RSA entitled “Vormetric Demonstrates SAP HANA® in the Cloud with Enhanced Security” or our joint white paper “Security in the Cloud for SAP HANA” by Intel with SAP, Virtustream and Vormetric.


Virtustream teams with Canara to reduce customer energy bills

By Alan Pearson, Director of Datacentres, EMEA, Virtustream

Virtustream has been at the cutting edge of cloud technology, and we continue to innovate by working with our key partners to deliver systems useful to our customers in the data center space. We are pleased to say our first installation of the Canara Branch Circuit Monitoring Subscription Solution outside North America, at our data center in London, went off without a hitch. Thanks to a modular design combined with standard Cat5e cabling, the hardware was completely installed in less than 45 minutes: the first side of the panel took only 15 minutes, while the other side was completed in under 10.

Thanks to Canara & Bounds Electrical Services, we are the first data center in the UK to deploy this innovative technology, further managing our energy usage and paying more than lip service to the green credentials of our data centers. Read more on our work with Canara.


With a little help from Marcos Almonaci (left), I was able to use Canara Connect to set all the circuit owners and verify our readings using his live PDU outputs (turns out that ours was much more accurate).


It’s a Hybrid World

By Stacy Hayes, VP Strategic Alliances & Channels, Virtustream

Cloud adoption at the enterprise level continues to increase as core functionality continues to improve. Paramount on the list of “must haves” is the ability to seamlessly, easily and securely move application and database workloads from one platform to another. Additionally, customers are keen to avoid vendor lock-in, so a true ability to move workloads between public and private clouds, with no strings attached, is a must. Workloads must also be dependably secured, with the same network and application controls in place both at rest and in flight. Enterprise cloud consumers are still a long way from realizing this utopia, but thanks to advances like Cisco’s new InterCloud, they’re a lot closer today.

Virtustream has long been a believer in hybrid – our software supports private cloud, public cloud and hybrid connection between the two. Cisco InterCloud is the beginning of simplifying the interconnection between enterprise private sites and public clouds, enabling secure network connections and selection of public cloud connections. We view this capability as a critical component of enabling enterprise adoption of the cloud and expect the concept to mature and expand further in the next few years. Virtustream xStream enhances hybrid connections further by adding high levels of security, compliance and performance SLAs for production and mission-critical apps. We also work closely with Cisco, and our software enhances both Vblock and FlexPod-based data centers with enterprise-class private and public clouds. We see a future of interconnected clouds linking business verticals like healthcare, manufacturing and education together with both private and public clouds.

From the Cisco partner perspective, this is a great example of the value of the Cisco relationship and the alignment they have with their service provider community. It begins with listening to the customer through the ears of their partners. By actively engaging their partner ecosystem, Cisco ensures, in real time, that their innovations meet the needs of their target audience throughout the product lifecycle. As a launch participant, we have the ability to influence and shape the offering prior to, during and after launch.

When it comes time to go to market, Cisco really shines. They have some of the best organized and well-thought-out partner programs in the business. Service providers depend on the Cisco brand and reach to connect to the market. In order to make that connection, Cisco created the Cloud and Managed Service Provider (CMSP) program to ensure successful end-to-end interaction from the service provider, through the Cisco field sales force and ultimately to the customer. The cornerstone of the program is a third-party, on-site audit of the service offering. In order to become “Cisco-Powered,” the provider must pass this rigorous inspection. Cisco is not just looking to verify that their parts and pieces make up the service. Adhering to ITIL standards, they ask for an evaluation of end-to-end solution delivery. Data centers, service desks, policies and procedures: everything is inspected in a process that takes two full days. It’s not for the faint of heart, but when it’s over, the level of assurance that both the customer and the Cisco sales force have in the service is priceless.

This is consistent with Cisco’s overall interaction with their partner community. Unlike the other popular compute and networking vendors, Cisco’s approach to solution delivery is 100% partner-centric. Service provider offerings augment, instead of compete with, Cisco’s solution approach. Because there are many partners, customers have a choice and vendor lock-in is avoided. The competitive forces of the service provider market ensure that the highest levels of quality and ingenuity are delivered to the customer. The ideas are fresh and customer driven, guiding not only service provider offerings but also technological innovation like Cisco InterCloud. We’re pleased to be part of the Cisco ecosystem and excited about collaborating with them to expand the use of cloud in the enterprise market.


The Cloud Market will Go Forth and Multiply, Though Not at the Expense of Quality

By Simon Aspinall, Chief Vertical Markets, Strategy, Marketing, Virtustream

2014 will see the cloud market step out of adolescence and become enterprise-ready. As enterprise organisations realise the significant impact that embracing the cloud can bring in terms of productivity and agility, the market will grow. However, at the same time there will be greater differentiation within the cloud industry as businesses become savvier about what they want from the cloud.

1. Heightened security and compliance requirements will lead to national clouds.

In the wake of this year’s revelations of the unprecedented level of NSA surveillance, the cloud marketplace will be largely focused on security and compliance. Encryption will become an ever more important attribute for cloud offerings to possess, while businesses will be increasingly concerned with advanced security features such as malware/virus prevention and BIOS and hypervisor authentication.

Heightened security concerns will also dictate the types of clouds that we will see being deployed in 2014. Organisations will increasingly look to deploy techniques such as geo-tagging and geo-fencing to track the location and movement of their data. This desire to know the exact whereabouts of data will see more organisations looking to move to nationally built clouds that operate under local laws, rather than large, multinational clouds. This does not mean, however, that there will be a decline in multinational clouds, as there is still a huge untapped market that will be moving to the cloud in 2014.

2. Cloud market will double and divide.

The cloud market is set to double in 2014, with an ever-increasing share of enterprise IT moving to the cloud within the next five years. Hand-in-hand with this growth will come the sub-division of the cloud market. As enterprise organisations become increasingly aware of the cloud deployments that will be most beneficial for their business, we will see greater differentiation in the marketplace. Rather than the single cloud market that we have grown accustomed to hearing about, 2014 will see it divided into specialities. In particular, large public clouds are proving very popular for the development of new applications and SaaS, while we will also see the rise of specialised clouds that support certain verticals and tasks, such as migrating traditional enterprise applications.

3. Majority of databases and ERPs will move to the cloud.

Over the next two years, the majority of databases and ERPs will move to a cloud (private, public or hybrid). We will see significant improvements in cloud performance in 2014, in large part due to improvements in cloud management platforms and the use of flash memory and flash storage, with read/write times becoming quicker, allowing businesses to access large data sets more efficiently. Over two-thirds of enterprise organisations are looking to move business-critical applications such as ERP to the cloud by the end of 2014, demonstrating an evolution of thinking about enterprise applications in the cloud. Large UK businesses in particular are accepting that moving these core applications to the cloud will bring great improvements to their organisation in terms of productivity, agility and competitiveness.

4. Hold your horses on cloud brokerage.

This growing commitment to the cloud will mean that the large majority of enterprise organisations will be using multiple clouds (private/public and hybrid) next year. With the desire for cloud on such a large scale, businesses will want integrated cloud management capabilities or multi-cloud federation. Seventy-five per cent of cloud implementations in 2014 are set to be hybrid, as businesses look to find a mix of solutions that best fit their needs. However, the enterprise is not yet ready to deal with multiple suppliers. Increasingly, functionalities such as compute, storage and network will be provided by a single interface as efforts are made to adapt to the needs of enterprise through the integration of infrastructure, but the diverse nature of clouds and underlying technologies make seamless integration difficult. This means that, while cloud brokerage will come into play at a later stage, 2014 will not be its breakthrough year.

5. OpenStack not ready to be welcomed in by the enterprise.

There will also be no major breakthrough for OpenStack in the enterprise family. While it is a great alternative for the SaaS and developer market, it is still at least two years away from having the key features needed to be trusted as an enterprise-ready platform (performance SLAs, security, compliance and hybrid support, for example). OpenStack’s current offering is very basic (mainly provisioning), but this is not to say that it will not feature in the enterprise in the near future. In addition, there are currently many competing flavours of OpenStack from multiple vendors (14+). This should not come as a surprise; after all, if we look at how long Linux took to mature and take hold, we can appreciate that this is a natural open-source process that OpenStack is going through. Its time may well come, but the rigorous demands of the enterprise dictate that 2014 will not be its year.

6. M&A on the radar.

Traditional hardware vendors (compute/memory/network/storage) operating in siloes will come under increasing pressure in 2014. Those focusing purely on compute or storage, for example, will struggle to make headway as it becomes clear that to make the cloud work for you, a software-based approach is needed that combines all four elements. We’ve begun to see this consolidation as Cisco, IBM, HP and others launch integrated hardware solutions. As these traditional vendors look to play catch-up, we will see a flurry of mergers and acquisitions.

Similarly, there are a number of companies providing key functions in cloud management that will continue to ally and combine in 2014. We saw early pioneers like Savvis and Terremark acquired by telcos. In 2013 we saw the acquisition of businesses like SoftLayer (by IBM), Tier 3 (by CenturyLink), Cloupia (by Cisco), and Nicira and DynamicOps (by VMware), along with a number of smaller businesses, as major vendors assemble cloud stacks and key functions. With the majority of enterprise IT spending moving to the cloud, this market will continue to grow and remain very dynamic.

We will also see plenty of innovation and investment over 2014 in the field of SaaS. A large number of early-stage businesses are looking to displace traditional software vendors with new SaaS offerings. The level of competition and pace will continue to rise in 2014.

7. On the horizon.

There are new functions that are yet to be delivered in the cloud. Improvements will be made in the fields of security, compliance, performance assurance, application monitoring and federation, but all are in the early stages of development and will not be completed in 2014.

2014 will see great leaps forward as the enterprise ramps up its adoption of the cloud. However, this is still a burgeoning industry and things will not happen overnight. Elements such as cloud brokerage and open source may well have a part to play in the future, but 2014 will not be their year. We must remember that the cloud market is like any other and, while progress may be rapid, maturity in any industry will take time.


We Love the Boom

Author: Rodney Rogers (Chairman/CEO, Virtustream; @rjrogers87)


“Adversity is the state in which man most easily becomes acquainted with himself, being especially free of admirers then.”
~ John Wooden

Much has been made about overcoming and learning from adversity when starting up and building companies. When you are building something significant, it’s generally a process of applying a mind-crushing amount of work to inventing, establishing, failing, recovering, adapting, constructing a beach-head on a precious success, and then doing it all over again.

Virtustream is my third lap around the entrepreneurial track (Adjoined and Kanbay, NASDAQ: KBAY). While these prior ventures were ultimately very successful due to the extraordinary efforts of many, at some point during the course of each of them I was relatively convinced we’d fail. Spectacularly. It’s nice now to be able to talk about those experiences given the certainty of their outcomes, but going through the low points was gut-wrenchingly miserable. In our first three years at Adjoined, we had to build a brand-new technology company through the 2000 tech market crash, the tragic events of September 11th, 2001, and the subsequent 2002 recession. We conceived and founded Virtustream in the midst of the 2008-2009 Credit Crisis. As an entrepreneur/CEO, one of my favorite adversity-related blog posts is The Struggle by Ben Horowitz. I can tell you from first-hand experience that Ben’s post is incredibly accurate. There is no relief when you are going through The Struggle. None.

This blog post is not about The Struggle, per se. It’s about Virtustream’s relationship with Mixed Martial Arts (MMA), a sport that is primarily a combination of wrestling, boxing and jujitsu. It is a sport that, in my opinion, epitomizes The Struggle for the fighters in its own unique way. The need to overcome adversity in this sport is present in virtually every aspect of it.

“Everyone has a plan until they get punched in the face.” ~ Mike Tyson

A very long time ago, I played football and wrestled. I truly loved everything about the game of football: the physical and competitive intensity, the locker room camaraderie, the crowd, even the smell of the grass on game day. I loved it. All of it.

Wrestling was different. While I enjoyed wrestling, I respected the sport more than I loved it. It’s a physically grueling sport. It’s an individual sport and there is nowhere to hide when you make a mistake. There are no other players to lean on when you alone are getting your ass kicked out there on the mat. When you are cutting weight, you alone are the one sucking on ice cubes after a 3-hour workout in a plastic suit. Mistakes are punitive and they are all on you. When things go badly, it is adversity in its purest form.

In my experience, many of the strongest start-from-scratch entrepreneurs and business leaders I meet come from some sort of meaningful team and/or individual sport background. While I have no statistical correlations to offer, I do believe it gives one a certain advantageous context for dealing with the inevitable adversity one must face when building and/or running a business. It takes courage to deal with adversity, and, for most of us, it takes some time to be able to embrace the combined emotions of fear and doubt so as to channel them effectively into productive solutions, and ultimately positive outcomes.

“I was a baseball fan myself, I wanted to play baseball.” ~ Kareem Abdul-Jabbar 

Yes, I’m a big MMA fan. This is the kind of stuff I’ll find posted on our intranet from time to time, courtesy of my awesome Virtustream peeps:


So it’s probably not a huge surprise that we sponsor MMA fighters from time to time, mostly in Ultimate Fighting Championship (UFC) events. We are a young venture-backed technology firm and spend a very modest amount of money on marketing each year (~2% of revenue), so we are careful where we place our precious dollars and utilize avenues like social media and various forms of online interaction extensively. I think we do pretty well in terms of return on investment in this regard. Sponsoring an MMA fighter on prime-time PPV or national TV (FOX) is not as expensive as you may think, if you follow the right process and know the right people. The primary reason we do this is to reach the 18-34 age demographic (loyal UFC viewers) for brand awareness, as this is the group our software engineers primarily come from. It has indeed helped us in a highly competitive market.

As importantly, we do it because we think it reflects the personality of our young firm and because of the great respect we have for athletes in the sport. I have gotten to know a number of the fighters we have sponsored. To a man (and woman), these people work incredibly hard in their pursuit of greatness. They experience The Struggle. All have faced adversity in their fights and, win or lose, have come out on the other side better at their profession.

I have also found many of the athletes in this sport to be uncommonly good people. They conduct themselves with honor and generally all give back to their community. They are pretty amazing people, really. These characteristics are the ones that we hope to embody as a firm.

UFC Fight For the Troops

This Tuesday, November 6th, Virtustream and KOreps have teamed up for a special promotion surrounding a UFC ‘Fight For The Troops’ event at Fort Campbell, Kentucky. We have put together a limited edition shirt for the event, and UFC fighters George Roop and Jim Miller will be on site at Fort Campbell to hand out these shirts free to military personnel while supplies last. In addition, Virtustream will donate $1 to the Intrepid Fallen Heroes Fund for every Facebook ‘like’ or Twitter ‘re-tweet’ of a photo that features a military member with this shirt.

As we like to say at Virtustream, Boom!


The Math

Author: Rodney Rogers (Chairman/CEO, Virtustream; @rjrogers87)

“The most savage controversies are those about matters as to which there is no good evidence either way. Persecution is used in theology, not in arithmetic.” ~ Bertrand Russell

It has been proven that markets, over time, will revert to the mean. It has also been proven that marketing, over time, will revert to the math. I love the math.

This Cloud technology market of ours is an extraordinary one. It’s disruption defined, creating an explosive market that will relentlessly expand. As cloud software and service providers, we are all furiously working to “get ours”. We are all craning our necks and straining to get our message heard above the fray. We all have to market. But if your software is really good, you have the math. Nothing, over time, beats the math.

This week, Cloud Spectator released the results of an independent performance analysis they did on the top 14 cloud service providers worldwide as defined by the 2013 Gartner IaaS Cloud Magic Quadrant (MQ). The only missing data point is CSC, who chose not to grant access to Cloud Spectator for this performance test. [NOTE: VMware’s vCHS and Google GCE were still in beta at the time of study.]


Understanding Virtustream uVM – Enabling Cloud Value

Authors: Kevin Reid (CEO/CTO), Matt Theurer (SVP/Co-Founder, VCDX#17) and Sean Jennings (SVP/Co-Founder, VCDX#16)

With the recent release of the Cloud Spectator “IaaS Performance and Value Analysis” report, several people have asked for more detail about Virtustream uVM technology, which is part of Virtustream xStream Cloud Management software.

There is a significant point to be made about the benefit of the uVM, which is only implied in the report. The report highlights that some cloud providers had variations in resources based on their instance sizes. That emphasizes two important points.



Virtustream Delivers the Highest IaaS Value (Price/Performance) in the Industry

Authors: Kevin Reid (CEO/CTO), Matt Theurer (SVP/Co-Founder, VCDX#17) and Sean Jennings (SVP/Co-Founder, VCDX#16)

This has the potential to be a long post. Apologies in advance. We wanted to make sure we provided enough detail. Summary: Virtustream IaaS Cloud delivers 10% greater Price/Performance than the Top 14 IaaS Cloud Providers and 30-50% greater value than the field.

[If you’d like more details, continue reading…]

In 2011-2012, we had been in the business of delivering managed and hosted services to Fortune 2000 clients for a few years and had been transitioning many of our clients to cloud-based consumption models.  We felt confident that our technology and operations were at a level where we could compete against much larger cloud providers. The 2012 Gartner IaaS Cloud MQ was going to be a big test of whether or not our IaaS cloud services, xStream software and IaaS revenues would match our belief that we had progressed enough to be considered a significant cloud provider.

Inclusion in the 2012 Magic Quadrant established Virtustream as a cloud provider of record. It brought our name to the forefront of this emerging industry.

Inclusion in the 2013 Magic Quadrant meant that we could consistently deliver financial results and were evolving technically to align with the rapidly changing demands of cloud services.

The Gartner IaaS Cloud Magic Quadrant establishes a technical baseline of cloud computing features focused on service-enablement, as well as standard capabilities such as secure access, monitoring and system management. Clients definitely understand the significance of providers being on the Magic Quadrant, but as they get more sophisticated in their cloud strategies, they have begun to ask us much more complex questions about how they can better serve their business needs.

Not all Clouds are created Equal

With that in mind, we wanted to go a couple steps further to focus on aspects that are critical to any Enterprise or Government IT organization that is considering running their production or mission-critical applications in the cloud. These are groups that are expected not only to deliver highly available applications to the business, but to be able to stand behind them with measurable response times and SLAs. These groups also have to be able to cost-justify expenditures, so any performance results need to be considered against the cost to deliver those results. Fast is excellent, but not if it breaks the bank.
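The cost-justification point can be made concrete with a simple value index: measured performance divided by price. A minimal sketch, using hypothetical provider names and numbers rather than any figures from the Cloud Spectator report:

```python
# Toy price/performance value index. Provider names and numbers below are
# hypothetical illustrations, not results from the Cloud Spectator study.
providers = {
    "Provider A": {"perf_score": 88.0, "usd_per_hour": 0.40},
    "Provider B": {"perf_score": 72.0, "usd_per_hour": 0.25},
    "Provider C": {"perf_score": 95.0, "usd_per_hour": 0.60},
}

def value_index(perf: float, price: float) -> float:
    """Performance delivered per dollar per hour."""
    return perf / price

# Rank providers by value, best first.
ranked = sorted(
    providers.items(),
    key=lambda kv: value_index(kv[1]["perf_score"], kv[1]["usd_per_hour"]),
    reverse=True,
)

for name, stats in ranked:
    score = value_index(stats["perf_score"], stats["usd_per_hour"])
    print(f"{name}: {score:.1f} perf/$")
```

Note how the ranking differs from raw performance: the fastest hypothetical provider is not the best value once its hourly price is factored in.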


2013 Future of Cloud Computing Survey Tells It Like It Is: Cloud Drivers, Inhibitors and Opportunities

Contributed Article By Michael Skok, General Partner, North Bridge Venture Partners

At this year’s GigaOM Structure Conference in San Francisco last month, North Bridge Venture Partners, GigaOM Research, Virtustream and close to 60 additional collaborators unveiled the results of the 2013 Future of Cloud Computing Survey.

This is the third consecutive year we’ve conducted this study into how the business and IT community views cloud computing – its drivers, inhibitors and opportunities. I’d like to extend my thanks to Virtustream and the other collaborators who helped make this our largest survey yet, with a sample of 855 respondents — over a third of whom were C-level executives.

This year’s findings point to a cloud market that continues to grow, with 75% of respondents reporting the use of some sort of cloud platform – up from 67% last year. It also exposed several shifts in why and how cloud computing is being used, as well as changes related to obstacles to adoption, where cloud decision-making resides within organizations today and how the vendor landscape is evolving.

Key takeaways include:

  • Business is driving cloud adoption in the Everything-as-a-Service era (EaaSe)
  • IT is investing heavily to catch up and support consumers graduating from Bring Your Own Device (BYOD) to Bring Your Own Cloud (BYOC)
  • Opportunities exist for vendors to mitigate reliability, security and complexity challenges associated with increasingly hybrid cloud environments
  • Industry concerns around interoperability and vendor lock-in are provoking change in the form of more APIs and greater data portability

Among the many survey results I find interesting is that business is driving cloud adoption – which is a direct reflection of their use of cloud services in a “boundary-less” way where they can seamlessly integrate them at home and work across all their devices. That boundary-less computing shift is indicative of an evolution from BYOD (Bring Your Own Device) to what we’re calling BYOC (Bring Your Own Cloud), which has all sorts of implications for companies in terms of accepting, adopting and updating usage policies.

Given who the drivers of cloud adoption are, the reasons shouldn’t surprise you. Agility (54.5%) and scalability (54.3%) were cited as the primary drivers for cloud adoption, followed by cost (48%), mobility (25%) and innovation (22%) to drive competitive advantage.

Another interesting development of note is that security is slowly but surely starting to lose its label as the primary inhibitor to cloud adoption. This year, 46% of respondents named security concerns as the #1 factor prohibiting further cloud adoption as compared to 55% of respondents in 2012. Meanwhile, 46% of respondents describe the management of IT as “more complex” with the growing use of cloud components, and vendor lock-in (35%), interoperability (27%) and cost (28%) continue to be cited as barriers to adoption. Reliability (22.3%) and complexity (21%) round out the list, reflecting real-world obstacles to an “always-on” services infrastructure.
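For readers who want the figures above in one place, a small Python sketch tabulates and ranks them, using only the percentages quoted in this post:

```python
# Adoption drivers and inhibitors as cited in the 2013 Future of Cloud
# Computing survey discussion above (percent of respondents).
drivers = {
    "agility": 54.5,
    "scalability": 54.3,
    "cost": 48.0,
    "mobility": 25.0,
    "innovation": 22.0,
}
inhibitors = {
    "security": 46.0,
    "IT management complexity": 46.0,
    "vendor lock-in": 35.0,
    "cost": 28.0,
    "interoperability": 27.0,
    "reliability": 22.3,
    "complexity": 21.0,
}

def ranked(table: dict) -> list:
    """Return (name, pct) pairs sorted highest-first."""
    return sorted(table.items(), key=lambda kv: kv[1], reverse=True)

for label, table in (("Drivers", drivers), ("Inhibitors", inhibitors)):
    print(label)
    for name, pct in ranked(table):
        print(f"  {name:<26} {pct:>5.1f}%")
```

Sorting makes the year-over-year story easy to spot: security still tops the inhibitor list at 46%, but it is down from 55% in 2012.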

Going forward, more than three-quarters of respondents expected hybrid clouds to be at the core of their cloud strategies in the next five years, clearly showing that the market is ready for more solutions that combine the security, control, compliance and reliability benefits of a private cloud with the scalable, agile and cost-effective capabilities associated with public clouds. Of course, the key for IT will be to seamlessly aggregate internal and external cloud services to give business the speed and ease of use it needs.

What do you find most surprising when reviewing our 2013 results? I encourage you to take a look at the full report, and let me know your thoughts via Twitter @mjskok or on my site.


Encryption and Key Vaulting Give Virtustream Customers Enterprise-class Cloud

To make xStream into a leading enterprise-class cloud orchestration system, Virtustream brought in SafeNet data protection solutions to give enterprise customers security and control of their data. While Virtustream has been using SafeNet’s authentication solutions for almost two years, it is expanding its cloud security offering to include SafeNet ProtectV and KeySecure for encryption of virtual images and key vaulting in the cloud.

“Security is part of our genome,” explained Gregsie Leighton, CISSP and Chief Security Architect at Virtustream. “We created an enterprise-class cloud solution, and a core component is enabling control and compliance. This means encryption for data at rest and in motion.”

And of course, whenever you implement encryption, it’s important to store those encryption keys in a secure key vault.
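The encrypt-then-vault pattern can be sketched in a few lines. This is a toy illustration of the concept only: the keystream construction below is not real cryptography (use a vetted cipher such as AES-GCM in practice), and the `KeyVault` class is a hypothetical stand-in for a hardened key store like KeySecure, not its actual API.

```python
import hashlib
import secrets

def keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    """Toy deterministic keystream (NOT real cryptography)."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def xor(data: bytes, stream: bytes) -> bytes:
    return bytes(a ^ b for a, b in zip(data, stream))

class KeyVault:
    """Hypothetical stand-in for a hardened key store (KMS/HSM)."""
    def __init__(self):
        self._keys = {}
    def store(self, key_id: str, key: bytes) -> None:
        self._keys[key_id] = key
    def fetch(self, key_id: str) -> bytes:
        return self._keys[key_id]

vault = KeyVault()
data_key = secrets.token_bytes(32)
vault.store("vm-image-001", data_key)     # the key lives in the vault...
nonce = secrets.token_bytes(16)
plaintext = b"virtual machine image bytes"
ciphertext = xor(plaintext, keystream(data_key, nonce, len(plaintext)))
# ...so only (ciphertext, nonce) travels with the VM; the key is elsewhere.
recovered = xor(ciphertext, keystream(vault.fetch("vm-image-001"), nonce, len(ciphertext)))
assert recovered == plaintext
```

The point of the separation is that an attacker who obtains the stored VM image gets only ciphertext; decryption additionally requires a key fetched from an independently secured vault.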

“ProtectV is able to protect the VM instance, and the encryption keys are stored securely with KeySecure,” said Chen Arbel, Director of Business Development at SafeNet. “SafeNet solutions protect data regardless of where it resides, affording separate security administration duties, enforcing granular controls, and establishing clear accountability with audit trails and compliance reporting. And xStream customers can easily scale the use of encryption in very large, dynamic cloud environments.”

One of the most important pieces of the puzzle was making sure that Virtustream’s customers not only have access to encryption, key vaulting and authentication, but are able to decide where to implement that security based on their own needs. Since different types of data are subject to different regulations and corporate policies, Virtustream wanted to make sure that each organization could easily add as little or as much security as needed to protect their data in the cloud.

“We automated processes within the xStream portal so that SafeNet’s security solutions are part of the ecosystem,” Gregsie said. “That automation cuts down on manual labor, reduces human error, and makes our customers’ lives easier.

“ProtectV, KeySecure and SafeNet Authentication Service give our customers more assurance into their data security throughout the data lifecycle – from creation through deletion.”

You can find out more about Virtustream’s xStream at

Learn about SafeNet’s portfolio of data protection solutions, including ProtectV and KeySecure, at
