2015: The Year of Real-Time Collaboration, Cloud Globalization and End-to-End Trust


By Kevin Reid
CEO & CTO, Virtustream

As cloud computing adoption rates continue to soar in 2015, IT leaders will look to address the growing challenges of security and compliance, performance and financial control. Coupled with external factors such as the threat of data breaches and pending regulation changes around the globe, the enterprise IT landscape will continue to grow in complexity and opportunity in the year ahead.


To keep up with these changes, IT leaders will adopt real-time collaboration techniques, geotrust-supported technology and greater transparency with vendors and SLAs, and will make strides toward an end-to-end trusted compute platform. Here are four predictions for how these issues will pan out in 2015:

  1. Real-time collaboration will address cyber threats: IT leaders will address growing cyber security threats in the cloud with new and improved real-time collaboration capabilities, particularly in the federal space. Groups will share knowledge to quickly triangulate the source of cyber threats and formulate a response as a joint initiative, rather than spending significant resources to do it independently. Given the sheer number of data breaches and security threats that emerged in 2014, this collaborative approach will give IT leaders a more efficient way to identify and address threats than the siloed, independent processes taking place today.
  2. Global data sharing becomes easier, safer: Along with the growing pains of scaling cloud infrastructure for a global business comes the inevitable question of how to protect data across borders. Enterprises must adopt technology that restricts data or applications from moving into territories that are forbidden, based on security or compliance requirements. This concept, called “geotrust” (see the sketch after this list), will continue to proliferate among enterprises as countries enact a growing number of privacy laws.
  3. Greater demand for visibility and transparency: As new technology is adopted, IT leaders require greater visibility into security, performance and cost when it comes to managing these tools. Service level agreements (SLAs) and vendors must provide greater transparency into average and expected costs in order for IT to regain financial control. This will also allow IT to plan and account for infrastructure changes, new resource requirements, or the need to quickly scale an app on demand via consumption-based models that eliminate the fear of a shocking bill at the end of the month.
  4. The end-to-end trusted compute platform: To date, most companies have focused on the security of the data center and the applications and infrastructure it runs on. The reality is that applications and networks are being accessed by a myriad of devices. In the Bring Your Own Device era, we’ll begin to see more emphasis on creating secure connections and assuring the integrity of the client side (smartphone, PC, etc.). Multi-factor authentication and security between the device and the data center/applications that talk to that device will support the vision of a trusted, end-to-end compute platform.
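To make the “geotrust” idea in prediction 2 concrete, here is a minimal sketch of a policy-driven placement check that refuses to move a data class into a forbidden territory. Everything in it (the policy table, the region names and the function) is hypothetical and invented for illustration; a real implementation would enforce such a policy at the orchestration layer.

```python
# Minimal sketch of a "geotrust"-style placement check (all data hypothetical).
# A workload tagged with a compliance regime may only land in approved regions.

ALLOWED_REGIONS = {
    # compliance tag -> regions where that data class may reside (illustrative)
    "eu-pii":   {"de-frankfurt", "nl-amsterdam"},
    "us-hipaa": {"us-east", "us-west"},
    "generic":  {"us-east", "us-west", "de-frankfurt", "nl-amsterdam"},
}

def placement_allowed(compliance_tag: str, target_region: str) -> bool:
    """Return True only if policy permits this data class in the target region."""
    return target_region in ALLOWED_REGIONS.get(compliance_tag, set())

# A scheduler would consult the check before migrating a VM or replicating data:
assert placement_allowed("eu-pii", "de-frankfurt")
assert not placement_allowed("eu-pii", "us-east")  # blocked: forbidden territory
```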

New trends around BYOD, real-time collaboration and the globalization of the cloud will surface and define cloud computing in 2015. Successful IT leaders will address these trends by responding with a collaborative approach to security and compliance, a greater eye toward performance, and an increase in transparency to better manage these new tools.




The Misadventures of Cloud Computing: When Reliability Matters



When Reliability Matters

We love funny cat pictures just as much as the next person, and our family vacation albums are among our most valued possessions, but those are personal priorities. We generally think about business priorities in a different light, with minimal overlap with the likes of Mr. Whiskers. Keeping personal and business priorities largely separate is a natural and understandable inclination. Given that, why should your cloud service provider be any different?

The consumer cloud works exceptionally well for just that – consumers. But when it comes to the enterprise, a cloud solution is about much more than just data storage. It’s about what you are storing and how you are using it. It’s about backup reliability and business continuity. The lack of adequate security or an ERM strategy with continuous threat monitoring has consequences far greater than data loss for an enterprise: it can seriously affect customers and directly harm the business. Simply put, enterprise-class data and information need an enterprise-class cloud infrastructure to support and protect them.

Want to see more? Take a look at our other marketoons!

When Reliability Matters: Part 1
When Reliability Matters: Part 2
When Reliability Matters: Part 3
When Security Matters
When Performance Matters





The Misadventures of Cloud Computing: When Security Matters



Fathers and sons. Start-ups and enterprises. Will they ever understand each other? As it is in life, when companies grow and mature they become responsible for more sensitive information, operate under higher stakes and restrictions and require more complex services to run their businesses and keep them at peak performance.

While he doesn’t see it, the alpha developer and his enterprise-IT dad, though technically in the same field, have vastly different needs. The advantages a web-scale solution brings to a host of use cases, including a video-game start-up, are very real, but the cloud is not one-size-fits-all. There is a complexity that comes with the legacy mission-critical apps and sensitive data inherent in an enterprise. Though not glamorous, security, compliance and reliability are non-negotiable for the enterprise customer.

Want to see more? Take a look at our other marketoons!

When Reliability Matters: Part 1
When Reliability Matters: Part 2
When Reliability Matters: Part 3
When Security Matters
When Performance Matters



The Misadventures of Cloud Computing: When Reliability Matters



When Reliability Matters

While no cloud service provider can claim 100 percent uptime, not all cloud services are created equal when it comes to performance and reliability.

The issue of reliability becomes exponentially more important for large enterprises that are migrating their mission critical business applications to the cloud. For these organizations, downtime is not an option.

Virtustream is independently recognized for its commitment to assuring our enterprise customers the highest possible levels of reliability and performance. According to a recent report by independent analyst Cloud Spectator, we outperformed 14 of the leading infrastructure-as-a-service (IaaS) cloud providers worldwide, as defined by the 2013 Gartner IaaS Magic Quadrant.

While reliability is of the utmost importance, recovery is a close second. That’s why we have aggressive recovery point and recovery time objectives to ensure that if something does go wrong, downtime is at an absolute minimum.

Your recovery plan should be more than bemoaning the latest instance of downtime to your Twitter followers. It’s about reliability and service. Because in the enterprise, we know that if you wait, more than just the ice cream will melt down.

Want to see more? Take a look at our other marketoons!

When Reliability Matters: Part 1
When Reliability Matters: Part 2
When Reliability Matters: Part 3
When Security Matters
When Performance Matters



The Misadventures of Cloud Computing: When Performance Matters



When Performance Matters

We know as well as anyone that the cloud solutions market is crowded, and getting more so every day. Nowadays everyone seems to be a cloud provider; everyone has a solution they want to sell you, from legacy hardware providers with “platinum new cloud capabilities” to purists who pressure you to adopt “real” scalable cloud technologies.

But buyer beware: Not all cloud service providers are created equal. Many can offer generalized solutions or pontificate about ideals, but do they really have the right software and services for your particular enterprise needs?

Enterprise cloud users require a much more sophisticated level of performance than a personal user or smaller business. In an enterprise, the data is sensitive and the stakes are high. It’s vital that a cloud provider that considers itself “enterprise grade” can guarantee a high level of performance and availability and provide the SLAs to back it up. If your mission-critical apps do not consistently run well in the cloud, any other promises just don’t matter.

Want to see more? Take a look at our other marketoons!

When Reliability Matters: Part 1
When Reliability Matters: Part 2
When Reliability Matters: Part 3
When Security Matters
When Performance Matters



The Misadventures of Cloud Computing


We’re excited to debut the first illustration in a five-part cartoon series by Tom Fishburne, “The Marketoonist.” With some techie humor and a touch of irreverence, our “Misadventures of Cloud Computing” series sheds light on the day-to-day challenges facing CIOs and IT leadership teams as they navigate the complex enterprise cloud landscape.

We’ll be unveiling a new cartoon every Wednesday for the next five weeks that puts a comical spin on what really matters when selecting an enterprise cloud solution – security, reliability and performance. We hope you check back in regularly for a midday chuckle and we encourage you to share your perspective and experiences on each cartoon’s theme.


When Reliability Matters

Server huggers. We all know them: the folks who are hesitant to give up something they can touch and feel, physical servers, for something distant and intangible. And while it is not actually about “where to put the coffee maker,” cloud reluctance is usually an emotional reaction. Change can be unsettling.

At first blush, it makes sense. Enterprise IT departments manage complex landscapes, and moving complicated, mission-critical legacy apps to the cloud is no small feat. The thought of experiencing any downtime during the transition is a disconcerting one. Oftentimes the stress and complexity of the transition can be misinterpreted as an aversion to the cloud altogether.

But transitioning your enterprise to the cloud, even in the most complicated instances, can be a smooth, secure ride if you have the right partners on board to lead you through the journey. And while some IT departments feel that the servers they can see and touch are safer or more dependable than ones they can’t, both security and reliability are fundamental to enterprise-grade cloud service providers, who offer continuous enterprise-wide monitoring at scale. They have a big stake in ensuring that your data remains safe and that you experience zero downtime during and after the move to the cloud.

While it may seem counterintuitive at first, moving to the cloud helps enterprise IT gain control of its systems and data rather than lose it. When less time and money is spent managing hardware and day-to-day upkeep, IT can put more resources into pursuing interesting projects that could make a significant impact on the business.

For more information, check out our LinkedIn page.

Want to see more? Take a look at our other marketoons!

When Reliability Matters: Part 1
When Reliability Matters: Part 2
When Reliability Matters: Part 3
When Security Matters
When Performance Matters



The Shift Away From Managing Security by Compliance to Managing Security by Risk



By Kaus Phaltankar, Chief Sales Officer, Enterprise Risk Management & President, Virtustream Security Solutions

Every day, enterprises are faced with new and constantly evolving threat vectors. The bad actors have to get it right only once, while enterprises have to defend themselves continuously and get it right every time!

Enterprise Risk Management (ERM) has emerged as an important business trend that builds a holistic and integrated approach to risk management across the enterprise. ERM encompasses a number of risk areas, including Information Technology (IT) Risk, Operational Risk, Regulatory Compliance Risk, Financial Risk and Reputational Risk.

The ERM risk areas are affected by the security posture within the enterprise. There is a paradigm shift within enterprises to move away from ‘managing security by compliance to managing security by risk.’ This moves the organization from managing security, Information Assurance (IA) or compliance through a discrete, snapshot-in-time ‘checklist-based’ approach to a real-time ‘continuous risk management’ approach. ERM provides stakeholders and decision makers a risk view across the enterprise, with detailed risk by department, bureau or information system. The risk view is available on demand, as well as over a period of time, to see how information system owners and leaders are managing risk.

This risk-based approach requires quantification of the risks identified by the various security and compliance tools monitoring everything from hardware and software assets to business-critical applications. The National Institute of Standards and Technology (NIST) Risk Management Framework (RMF) defines point security technologies, such as Asset Management Systems, Configuration Management tools, Vulnerability Scanners, Security Information and Event Management (SIEM) systems, as well as Governance, Risk and Compliance (GRC) tools, as ‘Sensors.’ The Department of Homeland Security (DHS) Continuous Asset Evaluation, Situational Awareness and Risk Scoring (CAESARS) framework places these sensors in its ‘Sensor Sub-system,’ which collects data about enterprise asset risks; that data is then analyzed and quantified using risk scoring algorithms. These per-asset risks are aggregated by information system and by the mission-critical functions those information systems support. The critical emphasis is on continuous risk monitoring that automates machine-to-machine information exchange through standards-based protocols such as the NIST Security Content Automation Protocol (SCAP). The continuous nature of the analysis allows information system owners to assess risk on a near real-time basis and become proactive in mitigating risks through a prioritized response.
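To make the aggregation step concrete, here is a minimal sketch, with invented severity weights and sample findings, of how per-asset sensor findings might be scored and rolled up by information system. Real RMF/CAESARS scoring algorithms are considerably richer; every name and number below is an assumption for illustration only.

```python
# Illustrative roll-up of sensor findings into per-system risk scores.
# Severity weights and sample findings are invented for this sketch.

from collections import defaultdict

SEVERITY_WEIGHT = {"low": 1, "medium": 3, "high": 7, "critical": 10}

# Each finding: (information_system, asset, severity) as reported by a sensor
# such as a vulnerability scanner or a configuration-compliance tool.
findings = [
    ("payroll",   "srv-01", "critical"),
    ("payroll",   "srv-02", "medium"),
    ("web-store", "srv-09", "high"),
    ("web-store", "srv-09", "low"),
]

def score_by_system(findings):
    """Aggregate weighted findings per information system (higher = riskier)."""
    scores = defaultdict(int)
    for system, _asset, severity in findings:
        scores[system] += SEVERITY_WEIGHT[severity]
    return dict(scores)

print(score_by_system(findings))  # {'payroll': 13, 'web-store': 8}
```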

Regulatory requirements such as GLBA, SOX, HIPAA, PCI and the Federal Information Security Management Act (FISMA) also mandate ‘continuous monitoring’ of information systems: protecting Personally Identifiable Information (PII) through Privacy Impact Assessments (PIAs), as well as monitoring for unauthorized access to maintain the integrity of data and systems.

The biggest challenge for enterprises today is to truly understand what it means to conduct continuous risk monitoring and what that entails. The key requirements of continuous risk monitoring under RMF are:

  • Close tracking of change and configuration management of assets
  • Monitoring of Information Assurance (IA) and governance controls using automated tools
  • Quantifying risk based on risk algorithms and computations
  • Document creation, updates and reporting

Enterprises need a scalable data ingest, collection, storage and processing platform that delivers an accurate and timely view of IT and operational risks by providing a 360° view of each asset’s software and hardware inventory, vulnerabilities (VUL) and compliance against approved secure baselines. Other key attributes of a Continuous Risk Management platform are:

  1. Scalability: The continuous nature of data collection automatically imposes scalability requirements, spanning data storage size, collection frequency and retention. Additionally, the platform needs to demonstrate flexibility in data collection and scalability in analysis. It must also be deployable stand-alone or set up in a tiered architecture to accommodate distributed enterprise implementations.
  2. Agnostic Sensor Coverage: The data ingest should be configurable to support multiple sensors based on current point technologies, as well as a multitude of input formats such as NIST SCAP, XML, JSON and CSV (sketched after this list).
  3. Creating a Common Operational View: The solution needs to provide a singular view of risk by asset, application, department, agency or the enterprise as a whole. This view needs to be on a single pane of glass, with full drill-down and drill-back capability to see what data contributed to the overall risk at each level of the dashboard.
  4. Data Warehousing: The platform should provide options for data warehousing based on the volume of data, from SQL to a NoSQL Big Data solution. SQL-based RDBMS data storage (e.g., MS SQL, Oracle, DB2) provides a traditional store for structured data, while a NoSQL Hadoop solution offers a store for structured and unstructured data with linear storage and processing scalability across a multi-node cluster.
  5. Monitoring Management/Workflow Capabilities: Enterprises need a solution that controls all workflow requirements in terms of risk monitoring, compliance with baseline specifications, and response metrics and related mitigation steps. Built-in workflow managers should be customizable to map business processes, assign risk values and trigger alerts for mitigation actions.
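As a rough illustration of the agnostic sensor coverage requirement in item 2, the sketch below normalizes JSON and CSV sensor output into a single record shape. The field names are invented for this example; a production platform would add SCAP/XML parsers, schema validation and streaming ingestion.

```python
# Sketch of format-agnostic sensor ingest: normalize JSON and CSV inputs
# into a single record shape. Field names are invented for illustration.

import csv
import io
import json

def normalize(record: dict) -> dict:
    """Map a raw sensor record onto the platform's common schema."""
    return {
        "asset": record["asset"],
        "check": record["check"],
        "result": record["result"].lower(),
    }

def ingest(payload: str, fmt: str):
    if fmt == "json":
        rows = json.loads(payload)
    elif fmt == "csv":
        rows = list(csv.DictReader(io.StringIO(payload)))
    else:
        raise ValueError(f"unsupported sensor format: {fmt}")  # add XML/SCAP here
    return [normalize(row) for row in rows]

print(ingest('[{"asset": "srv-01", "check": "CCE-1234", "result": "FAIL"}]', "json"))
print(ingest("asset,check,result\nsrv-02,CCE-5678,PASS\n", "csv"))
```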

At Virtustream, we offer our Viewtrust software for Enterprise Risk Management, Cyber Situational Awareness and Regulatory Compliance monitoring on a continuous basis. The Continuous Risk and Compliance Monitoring (CRCM) capability provides:

  • Proactive risk management using a standards-based risk management framework
  • Continuous monitoring of each asset for compliance and risk, building a 360° view of each asset within the enterprise from a multitude of sensor data
  • Massive processing capability using Hadoop Big Data solutions to quickly process large volumes of structured and unstructured data in a variety of formats (volume, variety and velocity)
  • Threat and impact analysis using external threat intelligence from US-CERT or commercial feeds, asset configuration policy, and hardening guides such as Security Technical Implementation Guides (STIGs) (see the sketch after this list)
  • Business Intelligence (BI) analytics and reporting for analyzing massive amounts of data, supporting diagnostics and the prioritization of mitigations
  • Near-automated mitigation by interfacing with other tools and technologies
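To illustrate the threat and impact analysis bullet above, here is a minimal sketch that intersects an external threat feed with a per-asset vulnerability inventory to flag which assets a published advisory actually touches. The data is made up, and this is not Viewtrust’s implementation, only the general technique.

```python
# Sketch: match an external threat feed (e.g., advisories listing CVE IDs)
# against an asset inventory to prioritize mitigation. All data is invented.

threat_feed = {"CVE-2014-0160", "CVE-2014-6271"}   # e.g., from a US-CERT bulletin

inventory = {
    "srv-01": {"CVE-2014-0160", "CVE-2013-2094"},  # known open vulns per asset
    "srv-02": {"CVE-2014-6271"},
    "srv-03": set(),
}

def exposed_assets(feed, inventory):
    """Return assets whose open vulnerabilities appear in the active threat feed."""
    return {asset: vulns & feed for asset, vulns in inventory.items() if vulns & feed}

print(exposed_assets(threat_feed, inventory))
# {'srv-01': {'CVE-2014-0160'}, 'srv-02': {'CVE-2014-6271'}}
```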

The Viewtrust Continuous Risk Monitoring approach truly enables Enterprise Risk Management and allows enterprises to shift away from ‘managing security by compliance to managing security by risk.’




Forecast: Cloudy With a Chance of Encryption



By Tom Evgey, Sr. Security Analyst, Virtustream

The RSA Conference in San Francisco, CA, is certainly one of the largest network security conferences in the world, if not the largest. Rows upon rows of vendors attend, showcasing the latest and hottest tools and services in the industry in an effort to convince you to buy their ultimate solution for protecting your most prized assets. Buzzwords like SIEM integration, log correlation, cloud security and forensic analysis were on almost every flyer; long lines of the security field’s finest crowded around in search of that one solution that will put all of us “enemy of the state” enthusiasts at ease.

Perhaps, as we each head back filled with promotional material and swag, we’ll take a closer look at some of the capabilities each of the vendors had to offer. And hopefully, we’ll find something we can bring back and integrate into our own security mission in an effort to proactively prevent the next big hack.

Securing The Perimeter

The common strategy has long been protecting our assets from the outside in. We utilize firewalls, an Intrusion Detection System (IDS), a proxy, maybe a Web Application Firewall (WAF) and Anti-Virus (AV), all of them perimeter-based.

Perhaps, if we’re really paranoid, we install a host-based IDS. We run our weekly or monthly scans, we have a log correlation system to collect our logs, we have HTTPS running and OTP protecting ALL of our applications, and, every once in a while, we test our ‘Incident Response Plan’ to check our readiness. But is that really the correct approach? I don’t think anyone would argue that the companies breached in the last few months were lacking any of the above. Still, we hear about these large-scale attacks, week after week, with a cloudy forecast for more. So what ARE we doing wrong?

Our Challenge

As a Cloud Service Provider, we face a unique challenge at Virtustream. We manage massive amounts of customer data. Our clients all use our processor power, disk I/O, network packets and memory resources, and their consumption is reported by the minute, based on need and environment, all while we maintain Confidentiality, Integrity and Availability (CIA) and stay current with every compliance framework. We need to find the right balance between protecting our customer data and Intellectual Property (IP) and providing accessibility to our user base… all at a very competitive price. It spans everything from a simple login to our web portal all the way to large-scale cloud bursting.

Our Solution   

So, how do we protect our data? What is the missing component in our security strategy? How do we maintain the integrity of our data without losing accessibility? If you guessed “getting another scanner or IDS with every bleeding-edge signature,” or yet another firewall, you should reconsider (and maybe take another quick glance at the article’s title).

The answer is encryption. Encrypting our data ensures WE have the keys to our assets, no matter where the data goes.

Here at Virtustream, we leverage the capabilities of Vormetric Encryption. Not only is the data encrypted and protected from inbound threats, but Vormetric also takes ‘access as needed’ to another level. As part of a policy, it can restrict any user from accessing the data by way of encryption. The prompt of ‘access denied’ now has a new meaning.
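As a toy illustration of the principle that whoever holds the keys controls access, here is a sketch using authenticated symmetric encryption from the widely used Python cryptography package. It is not Vormetric’s mechanism; real deployments add key management, rotation and policy enforcement on top.

```python
# Toy example of holding your own keys: authenticated symmetric encryption
# with the `cryptography` package (pip install cryptography). This is not
# Vormetric's mechanism; it only illustrates the principle that whoever
# holds the key controls access, wherever the ciphertext travels.

from cryptography.fernet import Fernet, InvalidToken

key = Fernet.generate_key()        # keep this in YOUR key management system
f = Fernet(key)

token = f.encrypt(b"customer record: acct 4471, balance ...")

print(f.decrypt(token))            # the key holder can read the data

try:
    Fernet(Fernet.generate_key()).decrypt(token)   # wrong key: access denied
except InvalidToken:
    print("access denied: no key, no data")
```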

Final words

Encryption of data is a must in any environment. It will be the one tool that saves the day when all other tools have failed. It will save the day when the marketing manager accepts a ‘LinkedIn’ invite that redirects to an HTTPS site hosting malware. It will save the day again when someone in finance opens a PDF file containing embedded malicious code. All of these scenarios, as I’m sure you know, happen every day. Our weakest links are always internal. Who does port knocking anymore, or IDS evasion, or tries to bypass the firewall, when a simple, cheap email with a scandalous or believable subject line is all it takes to get cracked wide open? Risks mostly come from inside, and encryption can prevent access to data even in the case of a breach.

We live in a fast evolving environment, where everything is changing rapidly and we try diligently to stay one step ahead of the attackers. The challenge is that we have to be right EVERY time, while they only have to be right ONCE.

So, my suggestion to you is to brighten up your cloud forecast with some encryption.  You’ll be glad you did!




Report from the Trenches of RSA 2014



By Pete Nicoletti, Chief Information Security Officer, CISO, CCSK, CISSP, CISA, CCNE, FCNSP

The planet’s smartest security minds have descended by the thousands on the Moscone Center here in San Francisco, desperately searching for their personal holy grail. They are looking through the myriad of security-related wares: the next generation of something; a hot new startup that could be worth billions; new tools that are cheaper, better, faster; a new market segment built on big data and, of course, cloud-enabled. The vendors have spent millions. The restaurants, streets, halls, classes and hotels are packed with people who are all part of the huge ecosystem that protects our networks, our data, our intellectual property and our way of life. And here, in this hyper-nexus of thousands of solutions, there is still just one single really good “answer” to be found.

What is that answer? The “answer” was invented thousands of years ago to protect war secrets and has been improved upon over time to be virtually unbeatable. The “answer” should be the first thing that is researched, selected, budgeted for and deployed in every environment. It should be required by every compliance framework and verified to be correctly in place by every audit. The “answer” should be the one thing that everyone who uses a credit card, or sees a doctor, or makes a phone call, or sends a picture by email, or uses a critical company-provided application should simply assume is in place protecting them.

Simply assuming that this “answer” is in place is wrong and in some cases dangerous. We have counted on Target, Neiman Marcus, Michaels, LinkedIn and even the NSA to protect our information and secrets, and they have all failed us. They all knew and had access to the “answer,” but in every case above (as in thousands of other recent breaches) they didn’t deploy the “answer” correctly or completely.

We hear about security failures and breaches every day, but there is also an ever-increasing incidence of “non-events.” Companies, universities, vendors, websites, banks and agencies are being breached, and every security tool they have in place is failing except one. And because this one tool does not fail, these breaches are classified as non-events: they don’t have to be reported, don’t make the news, don’t cause embarrassment or end careers. They don’t even require restitution via credit monitoring protection or refunds. State breach laws give these organizations safe harbor, and there are no costs or consequences to face even though they (almost) utterly failed. These organizations have the “answer,” they deployed it correctly, and it did not fail them.

Things must change. We can’t allow the largest theft in the history of the human race to perpetuate as our intellectual property is stolen away through copper and fiber and sent west to China. We should collectively rebel and refuse to pay the billions of dollars of breach costs that we as consumers all have to absorb. We should vote with our wallets and only buy our goods and services from providers that know the “answer” and know how to deploy it correctly.

So, what is the “answer”? Encryption.

You will find this “answer” everywhere at RSA: Major chip makers like Intel are building in-silicon security functions to make the process easier, faster and have less overhead. Major ERP Vendors like SAP are investing in supporting strong encryption. Vendors like Vormetric are offering extensive solutions for encryption deployment and management. Cloud software and services provider Virtustream is baking encryption in and insisting clients bring their own or buy the service.

I personally didn’t learn the “answer” without first paying a price. It was only after responding to hundreds of breaches and managing security for thousands of companies (including having one of my own companies get hacked) that I figured it out. Now that I have, though, I take every opportunity to let everyone else in on it. Case in point: the license plate on my vehicle is ENCRYPT!

So you see, I am doing my part to spread the word on encryption and make the world a safer place; now it is your turn! Encryption done right can solve so many of our critical data protection challenges, save us trillions in thefts and billions in breach costs, and virtually eliminate our failures. So, when and how are you going to ENCRYPT?

For more info on security, check out this press release from RSA entitled “Vormetric Demonstrates SAP HANA® in the Cloud with Enhanced Security” or our joint white paper “Security in the Cloud for SAP HANA” by Intel with SAP, Virtustream and Vormetric.




Virtustream teams with Canara to reduce customer energy bills


By Alan Pearson, Director of Datacentres, EMEA, Virtustream

Virtustream has been at the cutting edge of cloud technology, and we continue to innovate by working with our key partners to deliver systems useful to our customers in the data center space. We are pleased to say our first installation of the Canara Branch Circuit Monitoring subscription solution outside North America, at our data center in London, went off without a hitch. With a modular design combined with standard Cat5e cabling, the hardware was completely installed in less than 45 minutes; the first side of the panel took only 15 minutes, while the other side was completed in under 10.

Thanks to Canara and Bounds Electrical Services, we are the first data center in the UK to deploy this innovative technology, letting us further manage our energy usage and pay more than lip service to the green credentials of our data centers. Read more on our work with Canara.

Image: Canara Branch Circuit Monitoring at Virtustream. Caption: With a little help from Marcos Almonaci (left), I was able to use Canara Connect to set all the circuit owners and verify our readings using his live PDU outputs (it turns out ours were much more accurate).




It’s a Hybrid World


By Stacy Hayes, VP Strategic Alliances & Channels, Virtustream

Cloud adoption at the enterprise level continues to increase as core functionality continues to improve. Paramount on the list of “must haves” is the ability to seamlessly, easily and securely move application and database workloads from one platform to another. Additionally, customers are keen to avoid vendor lock-in, so a true ability to move workloads between public and private clouds, with no strings attached, is a must. Workloads must also be dependably secured, with the same network and application controls in place both at rest and in flight. Enterprise cloud consumers are still a long way from realizing this utopia, but thanks to advances like Cisco’s new InterCloud, they’re a lot closer today.

Virtustream has long been a believer in hybrid: our software supports private cloud, public cloud and hybrid connection between the two. Cisco InterCloud is the beginning of simplifying the interconnection between enterprise private sites and public clouds, enabling secure network connections and selection of public cloud connections. We view this capability as a critical component of enabling enterprise adoption of the cloud and expect the concept to mature and expand further over the next few years. Virtustream xStream enhances hybrid connections further by adding high levels of security, compliance and performance SLAs for production and mission-critical apps. We also work closely with Cisco, and our software enhances both Vblock- and FlexPod-based data centers with enterprise-class private and public clouds. We see a future of interconnected clouds linking business verticals like healthcare, manufacturing and education with both private and public clouds.

From the Cisco partner perspective, this is a great example of the value of the Cisco relationship and the alignment Cisco has with its service provider community. It begins with listening to the customer through the ears of its partners. By actively engaging its partner ecosystem, Cisco ensures, in real time, that its innovations meet the needs of the target audience throughout the product lifecycle. As a launch participant, we have the ability to influence and shape the offering prior to, during and after launch.

When it comes time to go to market, Cisco really shines. They have some of the best-organized and well-thought-out partner programs in the business. Service providers depend on the Cisco brand and reach to connect to the market. To make that connection, Cisco created the Cloud and Managed Service Provider (CMSP) program to ensure successful end-to-end interaction from the service provider, through the Cisco field sales force, and ultimately to the customer. The cornerstone of the program is a third-party, on-site audit of the service offering. To become “Cisco-Powered,” the provider must pass this rigorous inspection. Cisco is not just looking to verify that its parts and pieces make up the service. Adhering to ITIL standards, it asks for an evaluation of end-to-end solution delivery. Data centers, service desks, policies and procedures: everything is inspected in a process that takes two full days. It’s not for the faint of heart, but when it’s over, the level of assurance that both the customer and the Cisco sales force have in the service is priceless.

This is consistent with Cisco’s overall interaction with its partner community. Unlike the other popular compute and networking vendors, Cisco’s approach to solution delivery is 100% partner-centric. Service provider offerings augment rather than compete with Cisco’s solutions. Because there are many partners, customers have a choice and vendor lock-in is avoided. The competitive forces of the service provider market ensure that the highest levels of quality and ingenuity are delivered to the customer. The ideas are fresh and customer-driven, guiding not only service provider offerings but technological innovation like Cisco InterCloud. We’re pleased to be part of the Cisco ecosystem and excited about collaborating with Cisco to expand the use of cloud in the enterprise market.




The Cloud Market will Go Forth and Multiply, Though Not at the Expense of Quality


By Simon Aspinall, Chief Vertical Markets, Strategy, Marketing, Virtustream

2014 will see the cloud market step out of adolescence and become enterprise-ready. As enterprise organisations realise the significant impact that embracing the cloud can have on productivity and agility, the market will grow. At the same time, there will be greater differentiation within the cloud industry as businesses become savvier about what they want from the cloud.

1. Heightened security and compliance requirements will lead to national clouds.

In the wake of this year’s revelations of the unprecedented level of NSA surveillance, the cloud marketplace will be largely focused on security and compliance. Encryption will become an ever more important attribute for cloud offerings, while businesses will be increasingly concerned with advanced security features such as malware/virus prevention and BIOS and hypervisor authentication.

Heightened security concerns will also dictate the types of clouds deployed in 2014. Organisations will increasingly look to deploy techniques such as geo-tagging and geo-fencing to track the location and movement of their data. This desire to know the exact whereabouts of data will see more organisations move to nationally built clouds that operate under local laws, rather than large, multinational clouds. This does not mean, however, that there will be a decline in multinational clouds, as there is still a huge untapped market that will be moving to the cloud in 2014.

2. Cloud market will double and divide.

The cloud market is set to double in 2014, with an ever-increasing share of enterprise IT moving to the cloud within the next five years. Hand-in-hand with this growth will come the sub-division of the cloud market. As enterprise organisations become increasingly aware of which cloud deployments will be most beneficial for their business, we will see greater differentiation in the marketplace. Rather than the single cloud market we have grown accustomed to hearing about, 2014 will see the market divided into specialities. In particular, large public clouds are proving very popular for the development of new applications and SaaS, while we will also see the rise of specialised clouds supporting certain verticals and tasks, such as migrating traditional enterprise applications.

3. Majority of databases and ERPs will move to the cloud.

Over the next two years, the majority of databases and ERPs will move to a cloud (private, public or hybrid). We will see significant improvements in cloud performance in 2014, in large part due to improvements in cloud management platforms and the use of flash memory and flash storage, with read/write times becoming quicker and allowing businesses to access large data sets more efficiently. Over two-thirds of enterprise organisations are looking to move business-critical applications such as ERP to the cloud by the end of 2014, demonstrating an evolution in thinking about enterprise applications in the cloud. Large UK businesses in particular are accepting that moving these core applications to the cloud will bring great improvements to their organisation in terms of productivity, agility and competitiveness.

4. Hold your horses on cloud brokerage.

This growing commitment to the cloud will mean that the large majority of enterprise organisations will be using multiple clouds (private, public and hybrid) next year. With demand for cloud on such a large scale, businesses will want integrated cloud management capabilities or multi-cloud federation. Seventy-five per cent of cloud implementations in 2014 are set to be hybrid, as businesses look for the mix of solutions that best fits their needs. However, the enterprise is not yet ready to deal with multiple suppliers. Increasingly, functionalities such as compute, storage and network will be provided through a single interface as efforts are made to adapt to the needs of the enterprise through the integration of infrastructure, but the diverse nature of clouds and underlying technologies makes seamless integration difficult. This means that, while cloud brokerage will come into play at a later stage, 2014 will not be its breakthrough year.

5. OpenStack not ready to be welcomed in by the enterprise.

There will also be no major breakthrough for OpenStack in the enterprise family. While it is a great alternative for the SaaS and developer market, it is still at least two years away from having the key features needed to be trusted as an enterprise-ready platform (performance SLAs, security, compliance and hybrid support, for example). OpenStack’s current offering is very basic (mainly provisioning), but this is not to say that it will not feature in the enterprise in the near future. In addition, there are currently many competing flavours of OpenStack from multiple vendors (14+). This should not come as a surprise; after all, if we look at how long Linux took to mature and take hold, we can appreciate that this is a natural open-source process that OpenStack is going through. Its time may well come, but the rigorous demands of the enterprise dictate that 2014 will not be its year.

6. M&A on the radar.

Traditional hardware vendors (compute/memory/network/storage) operating in silos will come under increasing pressure in 2014. Those focusing purely on compute or storage, for example, will struggle to make headway as it becomes clear that making the cloud work requires a software-based approach that combines all four elements. We’ve begun to see this consolidation as Cisco, IBM, HP and others launch integrated hardware solutions. As these traditional vendors look to play catch-up, we will see a flurry of mergers and acquisitions.

Similarly, there are a number of companies providing key functions in cloud management that will continue to ally and combine in 2014. We saw early pioneers like Savvis and Terremark acquired by telcos. In 2013 we saw the acquisition of businesses like SoftLayer (by IBM), Tier 3 (by CenturyLink), Cloupia (by Cisco) and Nicira/DynamicOps (by VMware), and a number of smaller businesses being acquired as major vendors assemble cloud stacks and key functions. With the majority of enterprise IT spending moving to the cloud, this market will continue to grow and remain very dynamic.

We will also see plenty of innovation and investment over 2014 in the field of SaaS. A large number of early-stage businesses are looking to displace traditional software vendors with new SaaS offerings. The level of competition and pace will continue to rise in 2014.

7. On the horizon.

There are new functions still to be delivered in the cloud. Improvements will be made in the fields of security, compliance, performance assurance, application monitoring and federation, but all are in the early stages of development and will not be completed in 2014.

2014 will see great leaps forward as the enterprise ramps up its adoption of the cloud. However, this is still a burgeoning industry and things will not happen overnight. Elements such as cloud brokerage and open source may well have a part to play in the future, but 2014 will not be their year. We must remember that the cloud market is like any other: while progress may be rapid, maturity will take time.


