Roundup of Virtustream @ SAPPHIRE


by Michael Hoch
SVP of Cloud Advisory Services, Virtustream

SAP SAPPHIRE is always the biggest event of the year, with over 17,000 customers, prospects, and SAP experts rushing to Orlando to hear the latest and greatest from SAP.

This year, Virtustream @ SAPPHIRE was our biggest event in the history of the company. Our stellar booth sat right in the heart of the conference floor, with four stations where customers and partners could dig into critical topics for SAP customers interested in Enterprise Cloud solutions.

Of the hundreds of customers we spoke with over the three days, three themes kept coming up again and again, raised by customers in almost every vertical and of almost all shapes and sizes:

  • Is real-time reporting with SAP S/4HANA actually real, or is it SAP hype? SAP’s announcements at SAPPHIRE are often very forward-looking, bleeding-edge technologies that promise something great but are just hitting General Availability. Many attendees wanted to know whether S/4HANA is real and should be considered today, or whether they should let it “burn in” for a while before putting it on their roadmap.

Virtustream encouraged customers to look at it now, today. A few weeks earlier, Virtustream had announced the first production S/4HANA customer in the cloud: Don Whittington, CIO of Florida Crystals, took the plunge and completed the full migration to S/4HANA in just four weeks. In addition, Virtustream and its partner Infosys were demoing a real-life S/4 Simple Finance solution that is in use today by a major automotive customer.

SAP users should be cautious about going too far, too fast with S/4 Simple Finance: using it requires migrating the ECC database onto HANA. However, they should charge ahead with its close cousin, S/4 Central Finance, which can be deployed in a sidecar manner without requiring a full ECC migration to HANA. It can connect to non-SAP systems using SLT, and it can greatly improve the speed, accuracy, and flexibility of financial reporting.

  • How can a CTO transform their on-premise infrastructure to become a real “IT-as-a-Service” provider? The enterprise cloud market has matured incredibly rapidly over the last 2 years, including support for SAP HANA. When Virtustream announced the very first ECC on HANA in the cloud, and launched its Cloud Service for SAP HANA at Sapphire 2013, we spent 95% of our sales efforts educating customers about how to buy SAP and HANA from a cloud provider, including what a true utility model is (as opposed to managed hosting), what kinds of SLAs to expect, how DR can be architected, which security frameworks could be supported, and so on.

Now, just two years later, most customers know what they want and how they want to buy it. The big question is: Can we do this ourselves, in our own data centers? Or is “cloud” only available via a third party? Large and small IT shops have always wanted to provide fast-to-deploy, easy-to-manage, utility-based services for their internal groups. There’s a large existing investment in hardware and operations expertise that can’t be thrown away. And, despite the advanced security capabilities of many enterprise cloud services companies (including Virtustream), there are still many situations where the data must remain on-premise.

Virtustream’s Position: The full enterprise cloud IS available in an on-premise model today. A major manufacturing firm had a zero-minute recovery point requirement for its SAP HANA systems, which could only be met with a dual-site private cloud deployment within a metro region. Virtustream, Infosys, and VCE combined forces to provide a true enterprise cloud solution: utility-based and multi-tenant for use by dozens of internal business units, highly secure, scalable, and with near on-demand deployment capabilities for new projects sponsored by the business. Once fully deployed, the customer will be able to offer full “IT-as-a-service”.

  • Can a company with PCI, HIPAA, ISO, FedRAMP or other high security and compliance requirements really make use of an Enterprise Cloud? Ever since Amazon first launched their public cloud, there’s been concern about security in the cloud. Customers are rightly paranoid about maintaining their security frameworks. And in today’s climate, audit and compliance is not only a critical security capability, but a high cost of doing business. This is still the number 1 reason some customers say they may do test/dev in the cloud, but never run production.

Virtustream’s Position: It’s time to take a fresh look at security, both on-premise and in the cloud. With advancements in database encryption at rest, in motion, and in use; improvements in security hardening and user auditing; mature end-to-end risk and compliance management offerings; and the ability to know where your data can and can’t move using geo-fencing, the cloud is often *more* secure and compliant than an on-premise environment. It’s not easy, and it’s not simple, but with sufficient expertise and experience, and the right tools and processes, a highly secure enterprise cloud is available and in use by major commercial and private sector companies today.

The approach we recommend to customers is to be as specific as possible about the security, audit, and compliance requirements for your SAP and non-SAP workloads. Use that as your calling card to evaluate enterprise cloud providers. Then collaborate with your cloud vendor on the design, risk profile, management processes, RACI, etc. Cost should be the last area to review: once you have a design that meets your requirements, nine times out of ten your cloud vendor will work with you on how to get it at an attractive price. Of our 170+ SAP customers, we have not once encountered a security profile that we couldn’t meet or exceed. The cost was sometimes high, but the savings from moving to the cloud generally far outweighed the additional cost of the extra security.

 About the author

Michael Hoch is Virtustream’s SVP of Cloud Advisory Services.




Hybrid Cloud is the New Black


by Chris Hale
Vice President of Technical Marketing, Virtustream

Flexible, dynamic IT is all the talk, and Cloud Computing (especially the hybrid cloud) appears set to become the new business network.

The challenge for large enterprises is that they are time-poor, yet information- and demand-rich. While cloud-native start-ups can leapfrog traditional IT architectures and jump straight into the cloud, enterprises cannot. Investments in equipment, software, personnel, and data center facilities cannot simply be written off in favor of ‘asset-light’ public cloud services.

Build a Roadmap to Cloud Transformation

The reality is that building and managing a hybrid cloud is not as simple as sometimes portrayed. For one thing, the migration path involves a lot more than just deploying a private cloud and linking to a public service. Even the initial assessment phase involves multiple tasks like evaluating changes to existing infrastructure, determining appropriate public resources and setting the proper migration parameters.

There’s also more to hybrid cloud than just interconnection and movement of workloads. Onsite and in-the-cloud systems must be orchestrated to work together as a single system, and workloads should be able to run in the public cloud domain without rewriting the application code or redesigning the network architecture, security policies, or business logic.

Enterprise IT leaders also face the growing challenges of enterprise governance, security and risk management associated with unsanctioned consumer apps creeping into the workplace, and of supporting the myriad devices that come with BYOD. Compatibility and interoperability issues come into play as well, given that the public and private components of a hybrid solution feature different architectures, tool sets, management frameworks, and service catalogs. Disparities among these elements must be abstracted away just to ensure proper migration, not to mention seamless “inside” and “outside” application development and execution environments.

Dynamically Orchestrate, Manage and Scale

The hybrid cloud can provide self-service ‘stretch’ resources that augment what’s already in the enterprise data center, delaying or even eliminating the need for additional CAPEX in favor of an OPEX approach. Enterprises need to be able to quickly, seamlessly and securely move workloads, including those that are mission-critical, to and from the public cloud.

Hybrid cloud offers a way to satisfy the resource requirements of line-of-business managers, application developers, and IT operations personnel, with on-demand access, self-service rapid provisioning, and scalability, while still maintaining a secure, reliable, policy-driven IT infrastructure.


Putting the “Hybrid” in Hybrid Cloud

For enterprises, getting the most out of a hybrid cloud implementation will take more than just swiping a credit card to order some cloud virtual machines. For better or worse, by focusing primarily on the distinct infrastructures that form the hybrid cloud, simple definitions of the concept obscure both the power and the challenge inherent in enterprise hybrid cloud.

It is, in fact, the “hybridization”—really, the integration—of distinct cloud infrastructures, rather than simply the distinctness of those infrastructures, that allows enterprises to see benefits or return on investment.

A true hybrid cloud provides an integrated operational and user experience that allows enterprises to maximize benefits such as speed and agility, reliability, utilization, capacity and cost reduction.

In order to deliver a true hybrid cloud, businesses must consider how they will integrate disparate infrastructures while providing a common approach to meeting the security, governance, and administrative challenges that such an environment presents.

As a leading provider of enterprise-grade cloud software, infrastructure and services, Virtustream is uniquely positioned for hybrid cloud computing that meets the needs of businesses running large and complex applications, including SAP, Oracle and other legacy systems.

Download our Hybrid Cloud whitepaper and find out how Virtustream can help you manage and orchestrate a hybrid cloud transformation.

About the author


Chris Hale is Virtustream’s Vice President, Technical Marketing.

 




Virtustream Launches First Annual Cloud Conference in Lithuania



Virtustream held its first annual cloud conference in Lithuania at the Kaunas University of Technology (KTU), on Thursday, March 26th. The event offered local businesses and students the chance to understand the security, compliance, performance and efficiency requirements needed to migrate and manage the most complex applications across hybrid, private or public cloud environments.

Van Williams, CIO and Chief of Product Engineering at Virtustream and the host of the conference, said, “This conference further emphasizes Virtustream’s commitment to the Lithuanian technology industry.  It is a great chance to build on our relationship with KTU. It is also a chance to show local businesses and students how the work that we are doing in Lithuania can greatly benefit them and offer exciting career prospects in the cloud computing industry.”

In addition to the relationship with KTU, Virtustream hosts its research and technological development center in Kaunas, employing IT specialists to develop future releases of xStream, the company’s cloud management platform software. The event drew on the latest customer and partner experiences of the Virtustream team and encouraged local businesses and students to work in the cloud.

Some highlights included Kelly Bryant, Vice President of Product Management at Virtustream, kicking off the event with a presentation charting the beginnings and evolution of cloud computing, then later hosting a session focused on building a career in the cloud computing industry. Peter Jaeger, Senior Vice President of Product Management at Virtustream, reviewed the technicalities of moving an enterprise to the cloud and how businesses can take full advantage of rapidly evolving cloud technologies.

The event concluded with an expert panel comprised of industry leaders discussing the current cloud computing landscape. Van Williams moderated the discussion. Panel members included Robertas Balkys, Head of Research and Innovations Technology at TEO LT; Alasdair Hodge, Development Manager and CEO at Cloudsoft; Kęstas Liaugminas, Head of Product Development at Blue Bridge; Mindaugas Pranskevičius, CEO at Baltnetos Komunikacijos; and Donatas Zaveckas, CEO of BTT Group.

Please visit our Facebook Album to see photos of the event.

 




The Road to SAP HANA in the Cloud: Best Practices for CIOs


by Matt Clemente, Senior Vice President of Cloud Cover Services at Virtustream, and founder/creator of the SAP HANA migration and management factory methodology.

SAP has come a long way with its SAP HANA offering, touting the ability to significantly improve enterprise operations via more effective business processes, improved efficiency, and previously unachievable speed thanks to its “in-memory” database technology. To put this in perspective, consider that two years ago SAP HANA was really only relevant and popular for business reporting. Over the past 24 months it has become mainstream across nearly every industry vertical and widely adopted by companies running the core SAP Suite, as well as Business Objects, SAP HANA Live, and Hadoop integration, as businesses strive to understand and profit from real-time processing of the large quantities of business data running through their systems. More and more organizations are looking to deploy SAP HANA in the cloud for its scalability and ease of use as they transition to cloud deployments. To do so effectively, though, CIOs must consider several factors to ensure a seamless migration path and truly capitalize on and expedite the tangible business benefits that SAP HANA can deliver.

While organizations across the board are increasing their investment in, and roadmaps toward, the cloud for its simplicity, flexibility and cost-savings potential, the migration process can be anything but simple. Deploying and running SAP HANA in the cloud is particularly complex. It requires a strong understanding of the deployment process and should involve a partner that has experience standing up SAP HANA POCs. The provider must also have deep SAP HANA experience and understand the challenges and potential project show-stoppers and gotchas that may arise throughout the installation or migration process. With a team that has a solid background in proper tuning and follows OS, database, SAP and (where applicable) virtualization best practices, businesses can achieve a 20 to 40 percent gain in performance from their SAP HANA deployments.

Questions CIOs Must Ask to Maximize and Prepare for an SAP HANA Deployment

Before even going down the road of an SAP HANA migration, it’s important to understand the solution’s full capabilities and map the technology’s capabilities to the specific needs and priorities of the business. When correctly aligned, SAP HANA enables near real-time reporting and data delivery at a scale never before achievable. This was very evident during a recent go-live: end users claimed report functions were not working when, in reality, they were so used to 2-5 minute processing times that when the data was delivered in milliseconds they simply assumed their program or query was broken!

Creating a thorough blueprint of your business model gives you visibility into your organization’s needs, and should be the first step in any SAP HANA deployment. You can analyze how hard month-ends and year-ends are, see where your business is being hit the hardest, and pinpoint how SAP HANA’s intelligence and analytics can be used to improve your efficiency. For example, a global food manufacturer struggled to integrate disparate sources of data running across its high number of on-premises business intelligence platforms. By mapping out these various data sources and deploying SAP HANA in the cloud, they gained granular, real-time visibility into their sales data, allowing them to address problem areas and boost sales.

Another challenge organizations typically face when migrating to or implementing SAP HANA is finding a simplified migration path. SAP provides some starting tools and best-practice “cookbooks,” but there is no standardization and optimization when it comes to deploying SAP HANA in the cloud. Furthermore, any third-party software must be reviewed for compatibility with SAP HANA for a successful deployment. The key to overcoming these challenges comes down to working with a service provider that has true SAP HANA enterprise implementation, migration, and ongoing management experience. The investment will pay dividends, as it will expedite the implementation timeline, mitigate the potential risks to your project’s success, and enable the best use of the robust SAP HANA “in-memory” technology.

After implementation, the SAP journey isn’t quite over: IT must be on hand to monitor and maintain the solution. This is commonly overlooked, and it takes a partner or service provider with proven enterprise experience in maintaining SAP HANA, including, to give a few examples, handling revision upgrades and SAP HANA database refreshes and understanding the various ways SAP HANA high availability can be used to minimize downtime.

Once SAP HANA is in place, the ability to obtain business-level results in real time is invaluable, and the return on the investment is demonstrated almost immediately. Businesses typically see operations such as month-end processing, year-end processing, and payroll run 45-75% faster, while core reporting, transactions, and batch jobs can potentially run 35-1000% faster. Real-time data and live visibility into that data drastically change the business and enable business leaders to make intelligent, data-driven decisions.
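
To make those ranges concrete, here is a small, purely hypothetical worked example, reading “X% faster” as a speed-up factor of 1 + X/100; the baseline duration is an assumption for illustration, not customer data.

```python
# Hypothetical worked example: "X% faster" read as a speed-up factor of
# 1 + X/100. The 10-hour baseline is illustrative, not customer data.
def new_duration(baseline_hours: float, percent_faster: float) -> float:
    return baseline_hours / (1 + percent_faster / 100)

baseline_month_end = 10.0  # hours
for pct in (45, 75, 1000):
    print(f"{pct}% faster: {new_duration(baseline_month_end, pct):.1f} h")
# 45% faster: 6.9 h, 75% faster: 5.7 h, 1000% faster: 0.9 h
```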

While the migration journey to SAP HANA can seem complicated, with the right business process alignment, and an established and experienced Cloud implementation partner (thus also eliminating the CapEx cost), a significant ROI can be realized almost immediately upon going live. As an example, Global Fortune 500 organizations have migrated to SAP “Powered by HANA” running on Virtustream’s xStream cloud platform and have realized improvements in their productivity right off the bat.




2015: The Year of Real-Time Collaboration, Cloud Globalization and End-to-End Trust


By Kevin Reid
CEO & CTO, Virtustream

As cloud computing adoption rates continue to soar in 2015, IT leaders will look to address the growing challenges of security and compliance, performance and financial control. Coupled with external factors such as the threat of data breaches and pending regulation changes around the globe, the enterprise IT landscape will continue to grow in complexity and opportunity in the year ahead.


To keep up with these changes, IT leaders will adopt real-time collaboration techniques, geotrust-supported technology, greater transparency with vendors and SLAs, and make strides toward an end-to-end trusted compute platform. Here are four predictions that look at how these issues will pan out in 2015:

  1. Real-time collaboration will address cyber threats: IT leaders will address the growing cyber security threats in the cloud with new and improved real-time collaboration capabilities, particularly in the federal space. Groups will share knowledge to quickly triangulate the source of cyber threats and formulate a response as a joint initiative, rather than spending significant resources to do it independently. Given the massive number of data breaches and security threats that emerged in 2014, this collaborative approach will give IT leaders a more efficient way to identify and address threats than the siloed, independent processes taking place today.
  2. Global data sharing becomes easier, safer: Along with the growing pains of scaling cloud infrastructure for a global business comes the inevitable question of how to protect data across borders. Enterprises must adopt technology that restricts data or applications from moving into territories that are forbidden, based on security or compliance requirements. This concept, called “geotrust,” will continue to proliferate among enterprises as countries introduce an increasing number of privacy laws and concerns (a minimal policy-check sketch follows this list).
  3. Greater demand for visibility and transparency: As new technology is adopted, IT leaders require greater visibility into security, performance and cost when it comes to managing these tools. Service level agreements (SLAs) and vendors must provide greater transparency into average and expected costs in order for IT to regain financial control. This will also allow for planning and accounting for infrastructure changes, new resource requirements, or the ability to quickly scale an app on demand via consumption-based models that eradicate the fear of a shocking bill at the end of the month.
  4. The end-to-end trusted compute platform: To date, most companies have focused on the security of the data center and the applications/infrastructure it runs on. The reality is that applications and networks are being accessed by a myriad of devices. In the Bring Your Own Device era, we’ll begin to see more emphasis on creating secure connections and assuring the integrity of the client side (smartphone, PC, etc.). Multi-factor authentication and security between the device and the data center/applications that talk to that device will support the vision of a trusted, end-to-end compute platform.
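
To make the “geotrust” idea concrete, the following is a minimal, hypothetical sketch of a geo-fence placement check; the region names and policy structure are illustrative and not tied to any specific Virtustream product.

```python
# Hypothetical geo-fence policy check: the policy structure and region
# names are illustrative only, not a product API.
from dataclasses import dataclass

@dataclass
class GeoPolicy:
    workload: str
    allowed_regions: set  # regions where this workload's data may reside

def can_place(policy: GeoPolicy, target_region: str) -> bool:
    """Allow placement only if the target region is explicitly permitted."""
    return target_region in policy.allowed_regions

# Example: an EU-restricted payroll workload may not move to a US region.
payroll = GeoPolicy(workload="payroll-db", allowed_regions={"eu-west", "eu-central"})
print(can_place(payroll, "eu-west"))  # True
print(can_place(payroll, "us-east"))  # False: placement is refused
```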

New trends around BYOD, real-time collaboration and the globalization of the cloud will surface and define cloud computing in 2015. Successful IT leaders will address these trends by responding with a collaborative approach to security and compliance, a greater eye toward performance, and an increase in transparency to better manage these new tools.




The Misadventures of Cloud Computing: When Reliability Matters



When Reliability Matters

We love funny cat pictures just as much as the next person, and our family vacation albums are among our most valued possessions, but those are personal priorities. We generally think about business priorities in a different light, with minimal overlap with the likes of Mr. Whiskers. Keeping personal and business priorities largely separate is a natural and understandable inclination. Given that, why should your cloud service provider be any different?

The consumer cloud works exceptionally well for just that – consumers. But when it comes to the enterprise, a cloud solution is about much more than just data storage. It’s about what you are storing and how you are using it. It’s about backup reliability and business continuity. The lack of adequate security or an ERM strategy with continuous threat monitoring results in consequences far greater than data loss for an Enterprise. It can seriously impact customers and have a direct impact on corporate welfare. Simply put, enterprise-class data and information needs an enterprise-class cloud infrastructure to support and protect it.

Want to see more? Take a look at our other marketoons!

When Reliability Matters: Part 1
When Reliability Matters: Part 2
When Reliability Matters: Part 3
When Security Matters
When Performance Matters

 




The Misadventures of Cloud Computing: When Security Matters



Fathers and sons. Start-ups and enterprises. Will they ever understand each other? As it is in life, when companies grow and mature they become responsible for more sensitive information, operate under higher stakes and restrictions and require more complex services to run their businesses and keep them at peak performance.

While he doesn’t see it, the alpha developer and his enterprise-IT dad, though technically in the same field, have vastly different needs. The advantages a web-scale solution brings to a host of use cases, including a video-game start-up, are very real, but the cloud is not one-size-fits-all. There is a complexity that comes with the legacy mission-critical apps and sensitive data inherent in an enterprise. Albeit not sexy, security, compliance, and reliability are non-negotiable for the enterprise customer.

Want to see more? Take a look at our other marketoons!

When Reliability Matters: Part 1
When Reliability Matters: Part 2
When Reliability Matters: Part 3
When Security Matters
When Performance Matters



The Misadventures of Cloud Computing: When Reliability Matters



When Reliability Matters

While no cloud service provider can claim 100 percent uptime, not all cloud services are created equal when it comes to performance and reliability.

The issue of reliability becomes exponentially more important for large enterprises that are migrating their mission critical business applications to the cloud. For these organizations, downtime is not an option.

Virtustream is independently recognized for its commitment to assuring our enterprise customers the highest possible levels of reliability and performance. According to a recent report by independent analyst Cloud Spectator, we outperformed 14 of the leading infrastructure-as-a-service (IaaS) cloud providers worldwide, as defined by the 2013 Gartner IaaS Magic Quadrant.

While reliability is of the utmost importance, recovery is a close second. That’s why we have aggressive recovery point and recovery time objectives to ensure that if something does go wrong, downtime is at an absolute minimum.
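
As a rough illustration of what a recovery point objective means in practice, the sketch below checks whether a hypothetical backup schedule satisfies a stated RPO; the intervals and objectives are examples, not Virtustream SLA figures.

```python
# Hypothetical RPO check: the backup intervals and objectives below are
# examples for illustration, not Virtustream SLA figures.
from datetime import datetime, timedelta

def meets_rpo(backup_times, rpo: timedelta, now: datetime) -> bool:
    """Worst-case data loss is the time elapsed since the most recent backup."""
    return (now - max(backup_times)) <= rpo

now = datetime(2015, 6, 1, 12, 0)
backups = [now - timedelta(minutes=m) for m in (10, 70, 130)]
print(meets_rpo(backups, rpo=timedelta(minutes=15), now=now))  # True
print(meets_rpo(backups, rpo=timedelta(minutes=5), now=now))   # False
```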

Your recovery plan should be more than bemoaning the latest instance of downtime to your Twitter followers. It’s about reliability and service. Because in the enterprise, we know that if you wait, more than just ice cream will start to melt down.

Want to see more? Take a look at our other marketoons!

When Reliability Matters: Part 1
When Reliability Matters: Part 2
When Reliability Matters: Part 3
When Security Matters
When Performance Matters



The Misadventures of Cloud Computing: When Performance Matters



When Performance Matters

We know as well as anyone that the cloud solutions market is crowded, and getting more so every day. Nowadays everyone seems to be a cloud provider; everyone has a solution they want to sell you, from legacy hardware providers with “platinum new cloud capabilities” to purists who pressure you to adopt “real” scalable cloud technologies.

But buyer beware: Not all cloud service providers are created equal. Many can generalize solutions or pontificate idealisms, but do they really have the right software and services for your particular enterprise needs?

Enterprise cloud users require a much more sophisticated level of performance than a personal user or smaller business. In an enterprise, the data is sensitive and the stakes are high. It’s vital that a cloud provider that considers itself “enterprise grade” can guarantee a high level of performance and availability, and provide the SLAs to back it up. If mission-critical apps do not consistently run well in the cloud, any other promises just don’t matter.

Want to see more? Take a look at our other marketoons!

When Reliability Matters: Part 1
When Reliability Matters: Part 2
When Reliability Matters: Part 3
When Security Matters
When Performance Matters



The Misadventures of Cloud Computing


We’re excited to debut the first illustration in a five-part cartoon series by Tom Fishburne, “The Marketoonist.” With some techie humor and a touch of irreverence, our “Misadventures of Cloud Computing” series sheds light on the day-to-day challenges facing CIOs and IT leadership teams as they navigate the complex enterprise cloud landscape.

We’ll be unveiling a new cartoon every Wednesday for the next five weeks that puts a comical spin on what really matters when selecting an enterprise cloud solution – security, reliability and performance. We hope you check back in regularly for a midday chuckle and we encourage you to share your perspective and experiences on each cartoon’s theme.


When Reliability Matters

Server huggers. We all know them: the folks who are hesitant to say goodbye to something they can touch and feel, physical servers, for something distant and intangible. And while it is not actually about “where to put the coffee maker,” cloud reluctance is usually an emotional reaction. Change can be unsettling.

At first blush, it makes sense. Enterprise IT departments manage complex landscapes, and moving complicated, mission-critical legacy apps to the cloud is no small feat. The thought of experiencing any downtime during the transition is a disconcerting one. Oftentimes, the stress and complexity of the transition can be misinterpreted as an aversion to the cloud altogether.

But transitioning your enterprise to the cloud, even in the most complicated instances, can be a smooth, secure ride if you have the right partners on board to lead you through the journey. And while some IT departments feel that the servers they can see and touch are safer or more dependable than ones they can’t, both security and reliability are fundamental to enterprise-grade cloud service providers, who offer continuous enterprise-wide monitoring at large scale. They have a big stake in ensuring that your data remains safe and that you experience zero downtime during and after the move to the cloud.

While it may be counterintuitive at first, moving to the cloud helps enterprise IT gain control of their systems and data, not the other way around. When less time and money is spent managing hardware and day-to-day upkeep, IT can put more resources into pursuing interesting projects that could make a significant impact on the business.

For more information, check out our LinkedIn page.

Want to see more? Take a look at our other marketoons!

When Reliability Matters: Part 1
When Reliability Matters: Part 2
When Reliability Matters: Part 3
When Security Matters
When Performance Matters



The Shift Away From Managing Security by Compliance to Managing Security by Risk



By Kaus Phaltankar, Chief Sales Officer, Enterprise Risk Management & President, Virtustream Security Solutions

Every day, enterprises are faced with new and constantly evolving threat vectors. The bad actors have to get it right only once, while enterprises have to defend themselves continuously and get it right every time!

Enterprise Risk Management (ERM) has emerged as an important business trend that builds a holistic and integrated approach to risk management across the enterprise. ERM encompasses a number of risk areas, including Information Technology (IT) Risk, Operational Risk, Regulatory Compliance Risk, Financial Risk and Reputational Risk.

The ERM risk areas are affected by the security posture within the enterprise. There is a paradigm shift within enterprises to move away from ‘managing security by compliance’ to ‘managing security by risk.’ This moves the organization from managing security, Information Assurance (IA) or compliance via a discrete, snapshot-in-time, checklist-based approach to a real-time, continuous risk management approach. ERM provides stakeholders and decision makers a risk view across the enterprise, with detailed risk by department, bureau or information system. The risk view is available on demand, as well as over a period of time, to see how information system owners and leaders are managing risks.

This risk-based approach requires quantification of the risks identified by the various security and compliance tools monitoring everything from hardware and software assets to business-critical applications. The National Institute of Standards and Technology (NIST) Risk Management Framework (RMF) defines point security technologies, such as asset management systems, configuration management tools, vulnerability scanners, Security Information and Event Management (SIEM) systems, as well as Governance, Risk and Compliance (GRC) tools, as ‘sensors.’ The Department of Homeland Security (DHS) Continuous Asset Evaluation, Situational Awareness and Risk Scoring (CAESARS) framework refers to these sensors in its ‘Sensor Subsystem’ for collecting data about enterprise asset risks, which are then analyzed and quantified using risk-scoring algorithms. These per-asset risks are aggregated by information system and by the mission-critical functions those information systems support. The critical emphasis is on continuous risk monitoring that automates machine-to-machine information exchange through standards-based protocols such as the NIST Security Content Automation Protocol (SCAP). The continuous nature of the analysis allows information system owners to quickly assess risk on a near real-time basis and become proactive in mitigating risks using a prioritized response.
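
As a rough illustration of how per-asset sensor findings might be rolled up into an information-system risk score, the sketch below uses a simple weighted sum; the severity weights and sample findings are assumptions for illustration, not the actual RMF or CAESARS scoring algorithms.

```python
# Illustrative risk roll-up: the severity weights and findings below are
# assumptions, not the actual NIST RMF / DHS CAESARS scoring algorithms.
SEVERITY_WEIGHT = {"low": 1, "medium": 3, "high": 7, "critical": 10}

def asset_risk(findings):
    """Weighted sum of findings (vulnerabilities, configuration deviations) for one asset."""
    return sum(SEVERITY_WEIGHT[f["severity"]] * f.get("count", 1) for f in findings)

def system_risk(assets):
    """Aggregate per-asset scores into an information-system score."""
    return sum(asset_risk(findings) for findings in assets.values())

erp_system = {
    "db-host-01":  [{"severity": "critical", "count": 1}, {"severity": "medium", "count": 4}],
    "app-host-02": [{"severity": "high", "count": 2}],
}
print(system_risk(erp_system))  # 10 + 12 + 14 = 36
```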

Regulatory requirements, such as GLBA, SOX, HIPAA, PCI or the Federal Information Security Management Act (FISMA) also mandate ‘continuous monitoring’ of information systems for protecting Personally Identifiable Information (PII) using Privacy Impact Assessment (PIA), as well as monitoring of unauthorized access to maintain the integrity of data and systems.

The biggest challenge for enterprises today is to truly understand what it means to conduct continuous risk monitoring and what that entails. The key requirements of continuous risk monitoring under RMF are:

  • Close tracking of change and configuration management of assets
  • Monitoring of Information Assurance (IA) and governance controls using automated tools
  • Quantifying risk based on risk algorithms and computations
  • Document creation, updates and reporting

Enterprises need a scalable data ingestion, collection, storage and processing platform that delivers an accurate and timely view of IT and operational risks by providing a 360° view of each asset’s software and hardware inventory, vulnerabilities (VUL) and compliance with approved secure baselines. Other key attributes of a continuous risk management platform are:

  1. Scalability: The continuous nature of data collection automatically imposes scalability requirements, spanning data storage size, collection frequency, and retention. Additionally, the platform needs to demonstrate flexibility in data collection and scalability in analysis. It must also be deployable stand-alone or set up in a tiered architecture to accommodate distributed enterprise implementations.
  2. Agnostic Sensor Coverage: The data ingest should be configurable to support multiple sensors based on current point technologies, as well as a multitude of data inputs such as NIST SCAP, XML, JSON, CSV, and other formats (a minimal normalization sketch follows this list).
  3. Creating Common Operational View: The solution needs to provide a singular view of the risk by asset, by application, by department, by agency or by enterprise as a whole. This view needs to be on a single pane of glass with full drill-down and drill-back capability to view what data contributed to the overall risk at various levels of the dashboard view.
  4. Data Warehousing: A platform should provide options for data warehousing based on the volume of data, from SQL to a NoSQL big data solution. SQL-based RDBMS storage (e.g., Microsoft SQL Server, Oracle, DB2) provides a traditional store for structured data, while a NoSQL Hadoop solution offers a store for structured and unstructured data with linear storage and processing scalability using a multi-node cluster.
  5. Monitoring Management/Work Flow Capabilities: Enterprises need a solution that controls all workflow requirements in terms of risk monitoring, compliance with baseline specifications, as well as response metrics and related mitigation steps. Built-in workflow managers should be customizable to map business processes, assign risk values and trigger alerts for mitigation actions.
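
By way of illustration, the sketch below normalizes two hypothetical sensor feeds, one JSON and one CSV, into a common record shape; the field names are assumptions for the example, not a Viewtrust or SCAP schema.

```python
# Hypothetical sensor-ingest normalization: the field names are illustrative,
# not a Viewtrust or SCAP schema.
import csv, io, json

def from_json(payload: str):
    """Normalize a JSON vulnerability-scanner feed into common records."""
    return [{"asset": r["host"], "finding": r["cve"], "severity": r["severity"]}
            for r in json.loads(payload)]

def from_csv(payload: str):
    """Normalize a CSV configuration-check feed into the same record shape."""
    reader = csv.DictReader(io.StringIO(payload))
    return [{"asset": row["asset"], "finding": row["check"], "severity": row["result"]}
            for row in reader]

scanner_feed = '[{"host": "db-host-01", "cve": "CVE-2015-0001", "severity": "high"}]'
config_feed = "asset,check,result\napp-host-02,password-policy,medium\n"
records = from_json(scanner_feed) + from_csv(config_feed)
print(records)  # a single, sensor-agnostic list of findings
```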

At Virtustream, we offer our Viewtrust software for Enterprise Risk Management, Cyber Situational Awareness and Regulatory Compliance monitoring on a continuous basis. The Continuous Risk and Compliance Monitoring (CRCM) capability provides:

  • Proactive risk management using a standards-based risk management framework
  • Continuous monitoring of each asset for compliance and risk by building a 360° view of each asset within the enterprise using multitude of sensor data
  • Massive processing capability using Hadoop Big Data solutions to process volumes of data in a variety of formats quickly (volume, variety and velocity). The data could be structured or unstructured.
  • Perform threat and impact analysis using external threat intelligence from US-CERT or commercial inputs, asset configuration policy and hardening guides such as Security Technical Implementation Guides (STIGs)
  • Business Intelligence (BI) analytics and reporting capability for analyzing massive amounts of data for diagnostics and prioritizing mitigations
  • Enabling near-automated mitigation by interfacing with other tools and technologies

The Viewtrust continuous risk monitoring approach truly enables Enterprise Risk Management and allows enterprises to shift away from ‘managing security by compliance’ to ‘managing security by risk.’




Forecast: Cloudy With a Chance of Encryption



By Tom Evgey, Sr. Security Analyst, Virtustream

The RSA Conference in San Francisco, CA, is certainly one of the largest, if not the largest, network security conferences in the world. Rows upon rows of vendors attend, showcasing the latest and hottest tools and services in the industry in an effort to convince you to buy their ultimate solution for protecting your closest assets. Buzzwords like SIEM integration, log correlation, cloud security and forensic analysis were on almost every flyer; long lines of the security field’s finest crowded around in search of that one solution that will put all of us “enemy of the state” enthusiasts at ease.

Perhaps, as we each head back filled with promotional material and swag, we’ll take a closer look at some of the capabilities that each of the vendors had to offer. And hopefully, we’ll find something we can bring back to our own environment and integrate into our own security mission in an effort to proactively prevent the next big hack.

Securing The Perimeter

The common strategy has long been protecting our assets from the outside in. We utilize firewalls, an Intrusion Detection System (IDS), a proxy, maybe a Web Application Firewall (WAF), and anti-virus (AV), all perimeter-based.

Perhaps if we’re really paranoid, we install a host-based IDS. We run our weekly or monthly scans, we have a log correlation system to collect our logs, we have HTTPS running and OTP protecting ALL of our applications, and, every once in a while, we test our ‘Incident Response Plan’ to check our readiness.  But is that really the correct approach? I don’t think anyone would argue that the companies who have been breached in the last few months were lacking any of the above methods. Still, we hear about these large-scale attacks, week after week, with a cloudy forecast for more attacks. So what ARE we doing wrong?

Our Challenge

As a Cloud Service Provider, we have a unique challenge at Virtustream. We manage massive amounts of customer data. Our clients are all using our processor power, disk I/O, network packets and memory resources, and their consumption gets reported by the minute, based on need and environment, all while we maintain Confidentiality, Integrity and Availability (CIA) and stay current with every compliance framework. We need to find the right balance between protecting our customer data and Intellectual Property (IP) and providing accessibility to our user base… all at a very competitive price. This spans everything from a simple login to our web portal all the way to large-scale cloud bursting.
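
As a purely hypothetical sketch of what minute-level consumption reporting might look like, consider the following; the resource names and rates are illustrative assumptions, not Virtustream’s actual metering or billing model.

```python
# Hypothetical minute-level consumption metering: resource names and rates
# are illustrative assumptions, not Virtustream's actual billing model.
SAMPLES = [  # one record per VM per minute: (vm, cpu_ghz, ram_gb, disk_iops)
    ("erp-app-01", 2.4, 32, 1500),
    ("erp-app-01", 3.1, 32, 2200),
    ("erp-db-01", 1.2, 64, 800),
]
RATES = {"cpu_ghz": 0.002, "ram_gb": 0.0005, "disk_iops": 0.000001}  # $ per unit-minute

def minute_charge(cpu_ghz, ram_gb, disk_iops):
    """Cost of one minute of measured consumption."""
    return (cpu_ghz * RATES["cpu_ghz"] + ram_gb * RATES["ram_gb"]
            + disk_iops * RATES["disk_iops"])

totals = {}
for vm, cpu, ram, iops in SAMPLES:
    totals[vm] = totals.get(vm, 0.0) + minute_charge(cpu, ram, iops)
print(totals)  # per-VM charges for the sampled minutes
```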

Our Solution   

So, how do we protect our data? What is the missing component in our security strategy? How do we maintain the integrity of our data without losing that accessibility? If you guessed “getting another scanner or IDS with every bleeding-edge signature,” or yet another firewall, you should reconsider (and maybe take another quick glance at the article’s title).

The answer is encryption. Encrypting our data ensures WE have the keys to our assets, no matter where the data goes.

Here at Virtustream, we leverage the capabilities of Vormetric Encryption. Not only is the data encrypted and protected from any inbound threats, but Vormetric also takes ‘access as needed’ to another level. As part of a policy, it can restrict any user from accessing the data, by way of encryption. The prompt of ‘access denied’ now has a new meaning.
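
As a generic illustration of application-level encryption at rest, not Vormetric’s API, the sketch below uses the Python cryptography library’s Fernet recipe; key management is deliberately simplified for the example.

```python
# Generic encryption-at-rest illustration using the cryptography library's
# Fernet recipe. This is not Vormetric's API, and key management is
# simplified here for clarity.
from cryptography.fernet import Fernet, InvalidToken

key = Fernet.generate_key()      # in practice, keys live in a key manager/HSM
cipher = Fernet(key)

record = b"customer-payroll: confidential"
stored = cipher.encrypt(record)  # what actually lands on disk

# Only a holder of the key can read the data back.
print(cipher.decrypt(stored))    # b'customer-payroll: confidential'

try:
    Fernet(Fernet.generate_key()).decrypt(stored)  # wrong key
except InvalidToken:
    print("access denied")       # encryption enforces 'access as needed'
```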

Final words

Encryption of the data is a must in any environment. It will be the one tool that will save the day when all other tools have failed. It will save the day when the marketing manager accepts a ‘LinkedIn’ invite which redirects to an HTTPS site hosting malware. It will save the day again when someone in finance opens a PDF file containing embedded malicious code. All of these scenarios, as I’m sure you know, happen every day. Our weakest links are always internal. Who does port knocking anymore, or IDS evasion, or tries to bypass the firewall, when a simple, cheap email with a scandalous or believable subject line is all it takes to get cracked wide open? Risks mostly come from inside, and encryption can prevent access to data, even in the case of a breach.

We live in a fast evolving environment, where everything is changing rapidly and we try diligently to stay one step ahead of the attackers. The challenge is that we have to be right EVERY time, while they only have to be right ONCE.

So, my suggestion to you is to brighten up your cloud forecast with some encryption.  You’ll be glad you did!


