
Will Hybrid IT Eclipse The Public Cloud?


Technology continues to move forward at a rapid pace

What seems innovative and exciting today can quickly become obsolete tomorrow. For CIOs and those who select and implement IT infrastructure, these technological advances create a myriad of opportunities.

As you know, public cloud is getting lots of attention — with some believing that it is the only path forward for businesses. But is public cloud still delivering all the IT advantages it promised?

As we have seen, many businesses have already started to move beyond the public cloud into a new era of hybrid IT that combines public cloud, private cloud, and traditional IT. Is it possible that innovations such as hyperconvergence, containers, composable infrastructure and Azure Stack (as a private cloud) are already pushing us to think about how to manage infrastructure differently?

Accelerating deployments beyond public cloud

As I travel the world and talk with customers, I am seeing successful companies move toward a hybrid IT model because it lets them deploy applications into the IT environment that best suits a variety of specific workloads.

It’s clear that we haven’t arrived at a hybrid IT utopia yet, but the pieces are quickly falling into place. To achieve the goal of a simple hybrid IT infrastructure, the following advancements are needed.

Software-defined management to tie it all together

Industry-leading companies, like Hewlett Packard Enterprise, are currently working on software that will allow businesses to build and manage applications in one environment and seamlessly move them to another. A comprehensive, hybrid IT software platform will allow organizations to compose, operate and optimize workloads across on-premises, private, hosted and public clouds.

Build on existing infrastructure and technology

As hybrid IT takes hold, businesses aren’t rushing to throw away their current infrastructure. Instead, they are building on it as they transition to new technologies. This transition is imperative: tried-and-true solutions such as VMware and OpenStack need to work seamlessly with legacy on-premises platforms, container-based platforms and private-cloud stacks such as Azure Stack.

Cloud-like, software-defined platforms on premises

New offerings such as hyperconverged and composable infrastructure deliver cloud-like capabilities on premises, and these solutions can provide businesses more control, greater performance, lower cost and less risk than many public cloud options. A combination of on-premises, software-defined options within a private cloud, seamlessly combined with public cloud, lets businesses build the best possible infrastructure for their individual workloads.

Marketplace for hybrid IT

Another must-have for a simple hybrid IT environment is an easy-to-use marketplace so that businesses, IT operators and developers can quickly access the tools they need. For example, Cloud28+ is the world’s largest independent community that links customers with a global, open partner ecosystem of cloud service providers. Today, Cloud28+ features more than 600 HPE partners (including service providers, ISVs, value-added resellers, distributors and systems integrators).

The importance of analytics

An analytics-powered dashboard is needed to provide visibility into costs and utilization across private and public infrastructure, with insights down to the level of individual projects. These types of analytics will help businesses keep costs under control and ensure service levels are being met. For example, Cloud Cruiser lets businesses measure, analyze, optimize and control their usage and spend in private, public and hybrid IT environments.
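To make that idea concrete, here is a minimal sketch in Python of the kind of per-project roll-up such a dashboard performs. The project names, costs and utilization figures are illustrative assumptions, not output from any particular product.

    # Minimal sketch of a per-project cost/utilization roll-up across clouds.
    # All project names, costs and utilization figures are illustrative.
    from collections import defaultdict

    usage_records = [
        {"project": "web-frontend", "environment": "public",  "monthly_cost": 12400, "utilization": 0.71},
        {"project": "web-frontend", "environment": "private", "monthly_cost": 3900,  "utilization": 0.55},
        {"project": "data-lake",    "environment": "private", "monthly_cost": 8200,  "utilization": 0.82},
        {"project": "mobile-api",   "environment": "public",  "monthly_cost": 6100,  "utilization": 0.35},
    ]

    totals = defaultdict(lambda: {"cost": 0.0, "util": [], "envs": set()})
    for rec in usage_records:
        entry = totals[rec["project"]]
        entry["cost"] += rec["monthly_cost"]
        entry["util"].append(rec["utilization"])
        entry["envs"].add(rec["environment"])

    for project, entry in sorted(totals.items()):
        avg_util = sum(entry["util"]) / len(entry["util"])
        flag = "  <-- review: low utilization" if avg_util < 0.5 else ""
        print(f"{project:13s} ${entry['cost']:>8,.0f}/mo  avg utilization {avg_util:.0%}  "
              f"({', '.join(sorted(entry['envs']))}){flag}")

Even a toy roll-up like this shows why project-level visibility matters: the workloads worth moving are usually the ones paying for capacity they rarely use.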

Room to grow

As technology continues its rapid race forward, a simple hybrid IT infrastructure must be able to easily evolve. For example, APIs are currently simplifying data integration, and microservices that work together seamlessly are today’s ideal. As tomorrow brings memory-driven computing and edge servers that support IoT, a hybrid IT infrastructure will need to evolve to ensure simplicity and choice.

Putting it all together

Will the right mix of hybrid IT — public cloud, private cloud and traditional IT — soon replace public cloud as the future of IT for enterprises? From what I’ve seen, my answer is YES.


About Gary Thome

Gary Thome is the Vice President and Chief Technologist for the Software-Defined and Cloud Group at Hewlett Packard Enterprise. He is responsible for the technical and architectural directions of converged datacenter products and technologies including HPE Synergy. To learn how composable infrastructure can help you achieve a hybrid IT environment in your data center, download the free HPE Synergy for Dummies eBook.

To read more articles from Gary, check out the HPE Converged Data Center Infrastructure blog.

Finding your right mix of infrastructure with on-premises IT & public cloud


What’s your best cloud strategy?

We make thousands of decisions every day — 35,000+, if you believe internet sources. If you are a corporate executive or an IT manager, one of the bigger decisions you may be struggling with is whether public cloud is right for your business. And if it is, which applications should you deploy in the public cloud? Which ones should stay on premises?

And what about private cloud and newer technologies you may be hearing about, like hyperconverged and composable infrastructure?

Identifying the best of both public and private clouds

Determining a cloud strategy is a complex decision, one that takes a great deal of research and planning. The good news is that you don’t have to make an either/or decision in terms of public vs. private cloud. Enterprises are moving forward with a variety of cloud strategies, looking for IT solutions that best meet the needs of their workloads.

According to CIO.com, a recent survey by Forrester Research reported, “38% of enterprise decision-makers said they are building private clouds, with 32% procuring public cloud services and the remainder planning to implement some form of cloud technology this year. The hybrid cloud is also heating up, with 59% of respondents saying they are adopting the model.”


Organizations are also deploying those new technologies, hyperconverged and composable infrastructure, because they offer cloud-like capabilities on premises. In addition, hyperconverged and composable infrastructure can provide more control, greater performance, less cost and less risk than many public cloud options. That is why many companies are pulling back from public cloud, and why many more are thinking twice before jumping on board.

Finding the right mix in a simpler hybrid IT environment

As businesses research their hybrid IT options, it’s important for them to take the time to determine their right mix of private, public and traditional IT — what works best for their specific business and workloads. This kind of in-depth analysis can be difficult, so many businesses seek expert advice. A paper published by 451 Research last year, Best Practices for Workload Placement in a Hybrid IT Environment, delves into this complex issue, offering insights and recommendations.

Once you have determined your right mix, what comes next in the complex world of hybrid IT? Hewlett Packard Enterprise (HPE) is working on a new innovation that will help simplify hybrid IT deployments even further. It’s a new hybrid IT stack initiative that will allow customers to seamlessly compose, operate and optimize the right mix of public, private, hosted and on-premises clouds. Using a single management dashboard with built-in analytics and controls, application developers and IT operators will be able to simply click, compose and consume resources across their hybrid estate. This integrated cloud management and infrastructure solution will deliver more control, lower cost and higher utilization rates.

More choices, more opportunities

Hybrid IT with the right mix of public cloud, private cloud, and on-premises solutions lets businesses place each application or workload in the IT environment where it is best suited. And on the very near horizon, software advances will allow businesses to build and manage applications in one environment and seamlessly move them to another. This really will optimize and unite the best of what on-premises IT and public cloud offer.


About Gary Thome

Gary Thome is the Vice President and Chief Technologist for the Software-Defined and Cloud Group at Hewlett Packard Enterprise. He is responsible for the technical and architectural directions of converged datacenter products and technologies including HPE Synergy. To learn how composable infrastructure can help you achieve a hybrid IT environment in your data center, download the free HPE Synergy for Dummies eBook.

To read more articles from Gary, check out the HPE Converged Data Center Infrastructure blog.


Four different workloads: financial risk vs. reward of deployment options


Have you evaluated the risk?

Investors, shareholders and creditors are all familiar with financial risk: the potential for uncontrolled financial loss and the uncertainty it brings. When it comes to public cloud computing, financial risk also needs to be considered. Cloud computing shifts IT spending to a pay-as-you-go model (OpEx) instead of paying up front (CapEx), and the OpEx model sounds perfect: you get to use what you need when you need it. What could possibly go wrong?

When does it make sense to spend OpEx versus CapEx?

Below I’ve described four business models for deploying different workloads and analyzed the financial risk of each. This information can help you decide which workloads to deploy with a pay-per-use, OpEx model (used in the public cloud) and which workloads work best with a CapEx model (commonly used in traditional on-premises or private cloud).

Workload 1: Revenue generating app in the public cloud

Your developers have written a new app for your customers — one that generates revenue from everyone who uses it. This app is based on a pay-per-use model. As long as the revenue you receive from the app is greater than the cost for putting it in the public cloud (and you receive payment prior to your cloud bill), all is well.
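A quick back-of-the-envelope check makes the point. The per-use figures below are illustrative assumptions, not real cloud pricing.

    # Pay-per-use check for a revenue-generating app: as long as each use earns
    # more than it costs to serve, cloud spend scales safely with revenue.
    revenue_per_use = 0.05       # assumed revenue from each customer transaction
    cloud_cost_per_use = 0.012   # assumed compute/storage/egress cost per transaction
    monthly_uses = 2_000_000

    margin_per_use = revenue_per_use - cloud_cost_per_use
    print(f"Margin per use : ${margin_per_use:.3f}")
    print(f"Monthly margin : ${margin_per_use * monthly_uses:,.0f}")
    # If margin_per_use ever goes negative (or the cloud bill comes due before
    # the revenue arrives), the risk profile changes completely.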

Financial risk = NONE. Congratulations, you have selected wisely! Your revenue-generating app is a winning business model in the public cloud.

Workload 2: Non-revenue generating app

Your developers have created a new app that is freely available for your customers to use. You’re not certain how many customers will take advantage of it, but you’ve made some initial calculations. If only a portion of your customers take advantage of it, you can afford to pay the added costs incurred in the public cloud. But what if your app becomes wildly popular? Since you don’t have any revenue associated with it, you must pay for the added costs from your bottom line. The good news is … your app is very popular. The bad news is … your app is very popular AND costly!

Financial risk = HIGH. Tread carefully. Your non-revenue generating app running in the public cloud may just put you out of business!


Workload 3: Customer support app in the cloud

Because you know how many customers you have, a customer support app in the public cloud seems like a predictable expense — therefore, the financial risk should be small. You’ve run the numbers and the cost for implementing this workload in the cloud is known and stable. You’ve budgeted the expense.

Unfortunately, you just found out that sales missed expectations for the previous quarter and your CFO has determined each department must cut 20%. That leaves you in a bind, since you can’t cut 20% of your customers. If your workload were in a private cloud or on-premises, you could easily save 20% by putting off a budgeted tech refresh. Not so in the public cloud, since this cost is fixed.

Financial risk = MEDIUM to HIGH. You should probably start looking around for other ways to cut costs.

Workload 4: Workload with known demand

Spending CapEx dollars for IT when you are unsure of what you need isn’t smart business. Yet, the opposite is also true. When you have a workload with known demand, why would you overpay to put it in the public cloud?

As I mentioned in a previous article, it’s likely cheaper and faster to rent capacity in a cloud than to build and own it yourself. But just like renting a house, it becomes more expensive over time and you never own anything. If your workload is committed to run at a certain level over a long period, you may actually be wasting money in the public cloud — money that could be used to implement a better solution for your enterprise.

Since you already know how much you will need to spend to run the workload, why not invest that money in your own infrastructure instead of paying a premium to someone else? At the end of the app’s lifecycle, you actually may have paid two to five times as much in OpEx compared to if you had bought the equipment upfront with CapEx costs.
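Here is a rough sketch of that math. The purchase price, support fee and monthly cloud bill are illustrative assumptions for a workload with steady, known demand.

    # Cumulative pay-per-use (OpEx) spend vs. a one-time purchase (CapEx)
    # over the lifecycle of a steady, predictable workload.
    capex_purchase = 300_000           # assumed up-front cost to buy the infrastructure
    capex_annual_support = 30_000      # assumed yearly support and maintenance
    opex_monthly_cloud_bill = 22_000   # assumed steady public cloud bill

    for year in range(1, 6):           # five-year application lifecycle
        capex_total = capex_purchase + capex_annual_support * year
        opex_total = opex_monthly_cloud_bill * 12 * year
        print(f"Year {year}: CapEx ${capex_total:>9,}   OpEx ${opex_total:>9,}   "
              f"OpEx/CapEx ratio {opex_total / capex_total:.1f}x")

In this toy example renting wins in year one, but by year five the cumulative cloud bill is roughly three times the cost of owning — the same pattern behind the two-to-five-times figure above.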

Financial risk = MEDIUM to HIGH. Although this decision isn’t a devastating financial risk for your business, consider looking to private cloud for a smarter long-term strategy.

Less financial risk combined with cloud-like benefits — all in-house

In three out of the four scenarios I’ve discussed, application or workload deployment on-premises or in a private cloud involved lower levels of financial risk. In the past, building a private cloud may have been difficult, but with today’s new composable and hyperconverged infrastructures, it has never been easier. You can have all of the benefits of public cloud in your own datacenter — benefits such as simple integrations, subscription payment models and elastic capacity. And for those who want an OpEx model without the financial risk, that flexibility is available on premises. Lastly, as outlined in a 451 Research 2016 report, the private cloud model often comes with a lower total cost of ownership.

Every business needs to take stock of its applications and determine the financial risk of deploying in the public cloud versus deploying on traditional IT or in a private cloud. It’s time to mitigate that risk by exploring which of your key workloads and apps should be moved out of the public cloud.


About Gary Thome

Gary Thome is the Vice President and Chief Technologist for the Software-Defined and Cloud Group at Hewlett Packard Enterprise. He is responsible for the technical and architectural directions of converged datacenter products and technologies including HPE Synergy. To learn how composable infrastructure can help you achieve a hybrid IT environment in your data center, download the free HPE Synergy for Dummies eBook.

To read more articles from Gary, check out the HPE Converged Data Center Infrastructure blog


Why Businesses Are Exiting the Public Cloud


The Great Cloud Exodus

When I first started hearing about companies that were born in the public cloud and are now leaving, I sensed an era coming to an end, or at least turning a corner. As a trend, public cloud launched many enterprises into the virtual IT realm, quickly and suddenly. Now, I wondered how prevalent this migration back was becoming.

According to an article in Forbes published last year, a study conducted by 451 Research found that 20% of cloud users had moved one or more of their workloads from the public cloud to a private cloud. They even have a name for this migration – cloud repatriation.

The study also found that an additional 10% of public cloud users were planning to do the same thing in the future — move some workloads from public to private cloud. In total, those who were embarking on cloud repatriation planned to do so with 40% of their current public cloud workloads.

Meg Whitman, president and CEO of Hewlett Packard Enterprise, is quoted as saying, “According to IDC, 53% of enterprises have or are considering bringing their workloads back on-premises, and I am willing to bet that percentage is going to increase.”[1] These stats point to a dramatic shift in thinking from believing that public cloud is the “be all and end all” to “hmmm, maybe we need to slow down and consider developing a hybrid IT strategy, placing applications where they fit best.”

Public to private cloud — what’s behind the move?

What is causing this repatriation or migration from public to private clouds? The Forbes article I mentioned above quotes Andrew Reichman, cloud research director at the 451 Group, as saying, “Although security is one of the main drivers, greater control, cost, availability, and IT centralization are all a part of the repatriation of applications into the private cloud.”

I agree, the public cloud has strengths. Yet, as I’ve shown in previous articles, it is far from perfect for all apps.

Both big and small companies are leaving the public cloud

The biggest name in the cloud repatriation story is Dropbox. Although Dropbox was actually born in the cloud — originally started on AWS — it has since moved more than 600 petabytes of data from Amazon’s cloud to its own data centers. Dropbox says that the move has given them faster performance and lower costs.

Another well-known company that has left the public cloud and now enjoys a hybrid IT approach is TapJoy. Although TapJoy still boasts a large AWS footprint, they seem to have found a better balance of flexibility and control with a hybrid IT environment, placing some workloads in a private cloud and some in a public cloud.

The latest business that is moving out of the public cloud is Facebook’s WhatsApp. According to news reports, Facebook plans to leave IBM’s public cloud to bring their service into their own data center.

What about other companies — public, private, or hybrid IT?

CRN.com documents several sources recounting stories of businesses leaving the public cloud in search of a better, hybrid approach. “We started the company on a public cloud infrastructure, but over time as we grew we needed to move towards a private cloud solution,” said LiquidSky CEO, Ian McLoughlin.

In the same CRN.com article, CB Technologies President Jason Mendenhall said, “These big companies have been in public cloud, they are seeing the challenges and now they are looking at coming back. We’re also seeing lots of opportunities from late adopters who are now thinking they don’t need to go to public cloud. It’s not the end of public cloud. Let’s be very clear about that. It is not an either/or story. It is hybrid.”

In my role as chief technologist for Hewlett Packard Enterprise’s Software-Defined and Cloud Group, I have had the opportunity to speak with many customers about their challenges. At our June HPE Discover 2017 event in Las Vegas, I met with one customer who mentioned that his company had just started to test a public cloud by developing an app there. After only one month, he was shocked to see a bill of several hundred thousand dollars! He commented that for that price, they could have quickly funded an on-premises cloud.

Composable infrastructure leads the way to a hybrid IT environment

Private cloud options are now better than ever due to the introduction of composable infrastructure, which automatically composes resources as needed for any application. Because composable infrastructure gives you the flexibility and economics of the cloud combined with the control and security you get within your own data center, this new technology is quickly becoming the top choice for a hybrid IT environment.

Project New Hybrid IT Stack is a new HPE innovation that is helping customers simplify hybrid IT deployments. The vision of Project New Hybrid IT Stack is to deliver a comprehensive hybrid IT software platform that allows customers to seamlessly compose, operate, and optimize workloads across on-premises, private, hosted, and public clouds.

As businesses evaluate the best IT infrastructure for their workloads, hybrid IT with a mix of public cloud, private cloud and on-premises solutions will become the norm.

[1] IDC (‘Pay-per-Use Models in IaaS Survey,’ July 2016)


About Gary Thome

Gary Thome is the Vice President and Chief Technologist for the Software-Defined and Cloud Group at Hewlett Packard Enterprise. He is responsible for the technical and architectural directions of converged datacenter products and technologies including HPE Synergy. To learn how composable infrastructure can help you achieve a hybrid IT environment in your data center, download the free HPE Synergy for Dummies eBook.

To read more articles from Gary, check out the HPE Converged Data Center Infrastructure blog.


Is Public Cloud Really Cost-Effective?


More and more businesses are moving to the public cloud for at least a portion of their workloads

And why shouldn’t they? It’s an attractive model, with low upfront costs and the speed and agility businesses could only dream about a few short years ago.

Yet, many still have legitimate concerns about public cloud. I’ve touched on a few of these in previous articles: high-profile cloud outages, latency issues, and questions regarding control, security, and compliance.

Another nagging concern is cost. Although the upfront costs are enticingly low, the question remains…is public cloud really cost-effective over the long haul? Or are you burning money that could be used to fund a more effective mix of traditional IT, private, and public cloud?

The high cost of accessing your own data in a public cloud

The actual storage fee for housing your data in the public cloud is typically low. But once your data is in there, you must pay to access it, which for many comes as a surprise. That means the more value you get from your data, the more money you have to pay. Although these charges are considered low (in the range of $0.01 to $0.05 per 1,000 transactions), costs can quickly escalate when a customer is using the public cloud as primary storage or for storing any particularly active data set.

Public cloud vendors also charge for a variety of services besides just storing your data. You need to not only be aware of these charges, but also do a cloud storage cost analysis to really understand your total costs before moving your data to the public cloud.
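As a starting point, that analysis can be as simple as the sketch below. The per-GB and per-request rates are assumptions in the ballpark of published object-storage price lists, so substitute your provider’s actual figures.

    # Illustrative monthly cost analysis for an active data set in object storage.
    stored_tb = 200                 # data kept in the cloud
    monthly_reads = 500_000_000     # read requests against an active data set
    monthly_egress_tb = 30          # data pulled back out of the cloud

    storage_per_gb = 0.023          # assumed $/GB-month storage rate
    per_1000_requests = 0.01        # $ per 1,000 reads (low end of the range above)
    egress_per_gb = 0.09            # assumed $/GB data-transfer-out rate

    storage_cost = stored_tb * 1024 * storage_per_gb
    request_cost = monthly_reads / 1000 * per_1000_requests
    egress_cost = monthly_egress_tb * 1024 * egress_per_gb

    print(f"Storage : ${storage_cost:>9,.0f}")
    print(f"Requests: ${request_cost:>9,.0f}")
    print(f"Egress  : ${egress_cost:>9,.0f}")
    print(f"Total   : ${storage_cost + request_cost + egress_cost:>9,.0f} per month")

Even at the low end of the request pricing, accessing the data (requests plus egress) costs more in this example than storing it.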


You’ll also likely encounter data migration costs, which can be substantial. If you have a multi-cloud strategy, as many enterprises do, you’ll see additional costs as you move a workload from one cloud to another, along with any application adjustments that will probably be needed. You’ll also want to ensure that security and governance are covered; as I wrote in my article on data in the public cloud, your data is ultimately your responsibility, and protecting it may add expense. And while most vendors don’t charge to bring your data into the cloud, public cloud providers usually charge for data you move out of it.

All of these public cloud costs can add up, and the sticker shock may be more than you bargained for. Remember: Your data is your most important digital asset. Be careful not to make it too expensive to access!

On-premises: cost-competitive with public cloud

With new technologies, such as composable and hyperconverged infrastructure, implementing a private cloud becomes an effective option to deliver a public cloud experience in your datacenter. And it can often come with a lower total cost of ownership, as outlined in a 451 Research 2016 report.

A private cloud model combined with traditional IT is now a critical and proven option in the design and deployment of applications that require enhanced security, availability, and performance, at costs equal to or less than public cloud.

For example, HPE recently modeled three sizes of its private cloud solution on its composable platform, HPE Synergy. Working with CloudGenera, HPE found that the public cloud was 2-5 times more expensive than HPE Synergy over a 3-year period; that’s like buying the hardware 2-5 times over. And if those numbers aren’t enough to make you look twice at all your options, remember the added benefits of having your own hardware and software: dedicated performance and storage, complete control over your infrastructure, and increased agility due to new private cloud technologies.

Another popular option today is hyperconvergence. According to a 2016 report by Evaluator Group, HPE SimpliVity offers up to 49% total cost of ownership (TCO) savings over a three-year period when compared to Amazon Web Services (AWS). Cost savings in comparison to public cloud are just one reason why hyperconverged infrastructure has been growing so rapidly over the past few years. Other reasons include simplicity and speed of deployment, all while keeping control of your workloads on-premises.

Realistically, pricing can go up or down based on many factors; yet this comparison shows that pricing in the public cloud isn’t always the best deal.

Are you putting your money in the right cloud?

Public cloud providers have succeeded in marketing their services so well that many believe public cloud is more cost-effective than implementing a hybrid IT strategy. In the short term, it is likely cheaper and faster to rent capacity in a cloud than to build and own it yourself.  Over the long haul, however, renting may be the more expensive option, and you never own anything. If your workload is committed to run at a certain level over a longer period, you may actually be burning money in the public cloud — money that could be used to implement a better solution for your enterprise.

This realization raises the question: if you can have your applications and most valuable data on-premises, with full control and a cloud-like experience, for the same cost or even less, which would you choose?


About Gary Thome


Gary Thome is the Vice President and Chief Technologist for the Software-Defined and Cloud Group at Hewlett Packard Enterprise. He is responsible for the technical and architectural directions of converged datacenter products and technologies including HPE Synergy. To learn how composable infrastructure can help you achieve a hybrid IT environment in your data center, download the free HPE Synergy for Dummies eBook.

To read more articles from Gary, check out the HPE Converged Data Center Infrastructure blog.

 


Your Data in the Public Cloud: Your Responsibility


In today’s digital world, data is the new necessary resource.

And like many resources, data must be gathered and refined to extract value from it. Your business is probably doing just that — gaining a trove of information that helps you become more competitive.

If data is indeed that valuable, it makes sense to be vigilant in protecting it. Yet, when you put your data in the public cloud, have you considered that you may be giving up a fair amount of control? To investigate this claim, consider three key concerns in the public cloud: protection, compliance/data sovereignty and legal issues.

Protecting your data — implicitly trusting your cloud provider

Because data is critical to running a business, it is logical to be actively involved in protecting it. Yet, according to a press release by CTERA, two out of three companies using the public cloud are not focused on backing up their applications at all. Why? Because they believe that the cloud is more resilient than on-premises applications, a belief that facts don’t necessarily support. And a majority of organizations rely solely on their cloud providers to run backups, even though most admit that any loss of data in the cloud would be catastrophic to their business.

Another study by security firm Netskope found that 48% of companies surveyed don’t inspect their applications in the cloud for malware, and 12% weren’t sure if they did or not. Of those that do inspect, 57% said that they found malware. According to another report by the same firm this year, 23.8% of malware-infected files were shared with others, including internal or external users, or were even shared publicly.

Bottom line

Just because you put your data into the cloud doesn’t mean it is protected and secure. It’s still your responsibility to ensure backups are getting done and your data is being checked for malware.

Compliance and data sovereignty — YOU are responsible, not your cloud provider

One of the biggest problems with maintaining compliance in the cloud is simply knowing where your data is located. During an audit, you need to prove the location of your data along with the measures that are in place to protect it. You also must document the level of access for each user and how these levels are maintained. You can’t just assume that your cloud provider has security controls in place and that they are being used properly.
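In practice, that documentation starts as a simple inventory: where each data set lives, how it is protected, and who can touch it. The sketch below is only illustrative; the data sets, regions, users and the single sovereignty rule are made-up examples, not a compliance tool.

    # Illustrative compliance inventory: location, protection and access per data set.
    from dataclasses import dataclass, field

    @dataclass
    class DataAsset:
        name: str
        provider: str
        region: str
        encrypted_at_rest: bool
        access: dict = field(default_factory=dict)   # user -> access level

    inventory = [
        DataAsset("customer-orders", "public-cloud-a", "eu-west-1", True,
                  {"alice": "read-write", "bob": "read-only"}),
        DataAsset("hr-records", "on-premises", "frankfurt-dc", True,
                  {"carol": "read-write"}),
        DataAsset("web-logs", "public-cloud-b", "us-east-1", False,
                  {"alice": "read-only", "dave": "read-only"}),
    ]

    for asset in inventory:
        findings = []
        if not asset.encrypted_at_rest:
            findings.append("not encrypted at rest")
        # Example sovereignty rule: personal data of EU customers must stay in the EU.
        if asset.provider.startswith("public") and not asset.region.startswith("eu"):
            findings.append("stored outside the EU")
        users = ", ".join(f"{user} ({level})" for user, level in asset.access.items())
        status = "; ".join(findings) if findings else "OK"
        print(f"{asset.name:16s} {asset.region:12s} {status:40s} access: {users}")

If you can’t produce something like this table on demand, an auditor will notice before you do.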


In late 2015, the 15-year-old Safe Harbor framework, which had made it easier for American businesses to comply with more stringent data protection laws in Europe, was struck down. Several months later, the US-EU Privacy Shield agreement was signed, which mandated stronger policies. And the General Data Protection Regulation (GDPR) takes effect in 2018, putting in place even stricter mandates along with severe fines for non-compliance.

What do all these changes in data sovereignty mean to public cloud providers and to those who use their services? GDPR makes compliance more complex and harder to achieve if you store your data in the public cloud. And if you think compliance is the cloud provider’s problem and not yours, think again. The business, not the cloud provider, is considered to have primary responsibility. And as of September 2017, only 24.6% of cloud providers were rated “high” in a GDPR-readiness assessment, based on attributes such as where data is stored, level of encryption, and data processing agreement specifics.

Bottom line

You must ensure that you are compliant and be able to show auditors this information. Although you can outsource operations to a cloud service provider, you can’t outsource your responsibility.

Legalities — once you move your data, do you really still own it?

The Fourth Amendment was designed to protect U.S. citizens against unreasonable search and seizure. Although the Supreme Court recognized telephone calls as protected (almost 100 years after the telephone’s invention), no such precedent exists for public cloud. And that’s because an exception called the third-party doctrine states that citizens have no expectation of privacy when information is disclosed to a third party such as a public cloud provider.

Currently, the government can search information stored in the cloud without you ever knowing about it. The cloud provider is informed, but a gag order may keep you from ever knowing.  Different countries have different laws, and the legal system appears to be changing.

Bottom line

When you put your data in the public cloud, be aware of where it is being stored and what the laws are that govern it.  Chances are, you are giving up some amount of control.  Although it’s technically still your data, you may not even know if it is being accessed.

It’s your data and your responsibility

Companies are turning to the public cloud for a variety of reasons. Yet, putting all of your data in the public cloud without considering data protection, compliance/sovereignty, and legal issues could lead to some big headaches. And public cloud providers won’t be the ones responsible. Remember, it’s your data — your intellectual property, your analysis, and your competitive advantage!

As I wrote in this article on control over workload placement, businesses need to determine which workloads should be in the public cloud and which ones should remain on traditional IT or a private cloud. Due to new technologies, such as hyperconverged platforms and composable infrastructure, keeping your most valuable data on-premises is now easier, faster and more cost-effective than ever before.


About Gary Thome


Gary Thome is the Vice President and Chief Technologist for the Software-Defined and Cloud Group at Hewlett Packard Enterprise. He is responsible for the technical and architectural directions of converged datacenter products and technologies including HPE Synergy. To learn how composable infrastructure can help you achieve a hybrid IT environment in your data center, download the free HPE Synergy for Dummies eBook.

To read more articles from Gary, check out the HPE Converged Data Center Infrastructure blog.


Reality strikes: Latency issues plague public cloud


Technology, like everything else, has trends or cycles

Public cloud started more than 10 years ago and was the hot, new tech trend. But now…are things starting to shift? Are organizations thinking twice before automatically moving essential workloads to the public cloud?

The answer is yes, and for a variety of reasons. A few born-in-the-cloud companies have now moved from the public cloud back to on-premises data centers; Dropbox is a high-profile example. And public cloud’s performance (or lack thereof) was a big reason why.

Reality check: Public cloud is all about capacity, not performance

When businesses choose to put their applications in the public cloud, they are sharing infrastructure with a lot of other people. Of course, this can be a good solution because it means that you only pay for what you need when you need it. Public cloud also gives businesses the ability to scale up or down based upon demand.

But don’t forget the whole business model of public cloud: time-sharing. The provider is giving everyone a slice of the timeshare pie, which means that the provider is promising capacity, not performance. I am not the first person to highlight this drawback; I just want to reiterate it: yes, public cloud providers do place performance limits on the services they provide.

Of course, for workloads you deploy on-premises, you get to decide what the performance slice should be. Having this choice is imperative for applications that require reduced latency, such as those for big data and financial services.

Are new technologies making data centers new again?

Looking forward, two new technologies are now available that can boost performance: containers and composable infrastructure. Running containers on composable infrastructure can deliver better performance for all of your applications.

Containers are a lightweight, open source form of OS-level virtualization: each container shares a common Linux OS and keeps only the pieces that are unique to its application. Because there is no guest operating system per instance, you can hold a lot more containers on a particular server than virtual machines (VMs).
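A back-of-the-envelope comparison shows why. The per-VM and per-container overheads below are illustrative assumptions, not benchmark numbers.

    # Rough density comparison for one server: every VM carries its own guest OS,
    # while containers share the host kernel and add very little overhead.
    server_ram_gb = 512
    app_ram_gb = 2.0              # memory each application instance needs

    vm_overhead_gb = 2.0          # assumed guest OS + hypervisor overhead per VM
    container_overhead_gb = 0.1   # assumed per-container runtime overhead

    vms_per_server = int(server_ram_gb // (app_ram_gb + vm_overhead_gb))
    containers_per_server = int(server_ram_gb // (app_ram_gb + container_overhead_gb))

    print(f"VM instances per server        : {vms_per_server}")
    print(f"Container instances per server : {containers_per_server}")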

A big benefit of containers is increased performance. And when you run containers on bare metal, performance increases even more, because containers running on bare metal don’t require a hardware emulation layer separating the applications from the server.

HPE and Docker tested the performance of applications running inside of a single, large VM or directly on top of a Linux® operating system installed on an HPE server. When bare-metal Docker servers were used, performance of CPU-intensive workloads increased up to 46%. For businesses where performance is paramount, these results tell a compelling story.

Yet, some companies have hesitated to move containers out of virtual machines and on to bare-metal because of perceived drawbacks of running containers on bare-metal servers. These drawbacks, such as difficulties with managing physical servers, are definitely relevant when considering yesterday’s data center technologies. Composable infrastructure helps overcome these challenges by making management simple through highly automated operations controlled through software.

Composable infrastructure consists of fluid pools of compute, storage, and fabric that can dynamically self-assemble to meet the needs of an application or workload. These resources are defined in software and controlled programmatically through a unified API, thereby transforming infrastructure into a single line of code that is optimized to the needs of the application.
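As a rough illustration of what “infrastructure as a single line of code” looks like in practice, the sketch below posts a desired-state description to a composable infrastructure API. The endpoint, field names and template URI are hypothetical placeholders, not a documented vendor API; real platforms such as HPE OneView expose their own schemas.

    # Hypothetical example: describe the infrastructure you need as data and let
    # the composer assemble compute, storage and fabric to match.
    import requests

    COMPOSER = "https://composer.example.internal"    # placeholder address
    HEADERS = {"Auth": "REPLACE_WITH_SESSION_TOKEN"}  # placeholder auth header

    desired_state = {
        "name": "analytics-node-01",
        "templateUri": "/rest/server-profile-templates/analytics",  # hypothetical
        "compute": {"cores": 32, "memoryGiB": 256},
        "storage": [{"sizeGiB": 2048, "raidLevel": "RAID5"}],
        "networks": [{"name": "prod-east", "bandwidthGbps": 20}],
    }

    response = requests.post(f"{COMPOSER}/rest/server-profiles",
                             json=desired_state, headers=HEADERS, timeout=30)
    response.raise_for_status()
    print("Provisioning task:", response.json().get("taskUri", "<pending>"))

The point is the shape of the interaction: one declarative request, and the platform handles the physical composition underneath.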

Because composable infrastructure is so simple to deploy and easy to use, it removes many of the drawbacks you would traditionally encounter when deploying containers on bare-metal. The end result is better performance at lower costs within your own data center. The combination of containers and composable infrastructure is a marriage made in heaven.

A hybrid IT cloud strategy solves the performance problem of public cloud

When considering where to deploy, first consider the performance needs of your application. Then compare those performance needs against the service levels offered by public cloud vendors and what you can deliver on premises. As I wrote in this article about control over workload performance, businesses need to determine which workloads should be in the public cloud and which ones should remain on traditional IT or a private cloud. And thanks to today’s new technologies, containers and composable infrastructure, staying with traditional data-center deployments may just be the better choice.


About Gary Thome


Gary Thome is the Vice President and Chief Technologist for the Software-Defined and Cloud Group at Hewlett Packard Enterprise. He is responsible for the technical and architectural directions of converged datacenter products and technologies including HPE Synergy. To learn how composable infrastructure can help you achieve a hybrid IT environment in your data center, download the free HPE Synergy for Dummies eBook.

To read more articles from Gary, check out the HPE Converged Data Center Infrastructure blog.


Public Cloud Outages: Should You Put All Your Workloads in One Place?


Gary Thome, VP and Chief Technologist at HPE’s Software-Defined and Cloud Group, analyzes the recent high-profile public cloud outages

A growing number of businesses are becoming dependent on public cloud for their IT systems. Looking back at several high-profile public cloud outages this year, a consistent response from analysts and authors emerged: maybe businesses shouldn’t have all their workloads in one place. Just maybe, a diversified approach that combines both public and private clouds would work better.

A diversified strategy

After the memorable AWS outage in February, Dana Gardner, Principal Analyst, Interarbor Solutions, stated, “Cloud sourcing is no different than any product or service sourcing. The old adages still apply: Don’t put all your eggs in one basket, and keep your options open.  A private to multi-cloud continuum that can react in real time is and will remain the safest route to nonstop business continuity.”

Not surprisingly, after the outage a few articles defended the public cloud monolith. One such article quoted a prominent analyst who covers public cloud referring to the outage as a “hiccup.” Yet according to another article in Business Insider, the disruption hurt 54 of the top 100 internet retailers when their websites either crashed completely or slowed by 20% or more. Cyence, a startup that analyzes the economic impact of internet risk, had reported that financial services companies in the U.S. lost an estimated $150 million.

So how do you decide? It’s all about control.

The advice to diversify doesn’t really guide anyone’s decision on where to host different workloads with different needs. The outage back on February 28 (last day of the month) was the result of AWS doing some routine maintenance — a simple command. Yet due to human error, simple maintenance caused major issues for many businesses. Before this routine procedure, did AWS contact everyone and say, “Hey, we’re planning to do some changes. Does this timing work for you?”

Of course not. Yet, who is in control is an important thing to consider. If my business runs critical end-of-month reports, I probably wouldn’t have scheduled anything to be changed on this last day of the month. (Because, as we all know, things such as routine IT maintenance, software upgrades, etc., don’t always go as planned.)

So perhaps before putting an application in a public cloud, the question to ask is, “Can I accept an unexpected outage at any possible time in this application?” If the answer is no, then maybe you need to run this application on premises where you can control when system upgrades can or cannot occur.

Public clouds can’t be controlled, at least not by any individual business that uses them. Businesses that put their workloads in the cloud give away a fair amount of that control. Knowing this fact, it’s wise to plan accordingly. And more importantly, do an inventory of your cloud workloads to assess whether they are meeting your current SLA commitments.
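A simple way to start that inventory is to turn each provider’s uptime commitment into minutes and compare it against what you actually observed. The outage log and the 99.95% target below are illustrative assumptions.

    # Does observed downtime fit the SLA? (illustrative numbers)
    HOURS_PER_MONTH = 30 * 24

    sla_target = 0.9995                    # e.g. a "99.95% monthly uptime" commitment
    allowed_downtime_min = (1 - sla_target) * HOURS_PER_MONTH * 60

    outage_minutes = [12, 3, 31]           # outage durations logged this month
    observed_downtime = sum(outage_minutes)

    print(f"Allowed downtime : {allowed_downtime_min:.1f} minutes/month")
    print(f"Observed downtime: {observed_downtime} minutes")
    print("SLA met" if observed_downtime <= allowed_downtime_min else "SLA breached")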

A hybrid IT approach gives you more control

Now is the time for every business to take stock of their applications and decide which ones should be in the public cloud and which ones should remain on traditional IT or a private cloud. Thanks to recent innovations in hyperconverged and composable solutions, private cloud options are now better than ever. Speed, agility, and efficiency are standard features in these new offerings, giving businesses all of the benefits of the public cloud without losing any of the control.

With the new innovations around private cloud, why put all your eggs into one basket? Instead, spread out your risk with a strategy that blends the best of public and private cloud.

Many businesses are learning that diversifying risk with a hybrid IT strategy is a smarter approach. It’s time to take back some much needed control.


About Gary Thome

Gary Thome is the Vice President and Chief Technologist for the Software-Defined and Cloud Group at Hewlett Packard Enterprise. He is responsible for the technical and architectural directions of converged datacenter products and technologies including HPE Synergy. To learn how composable infrastructure can help you achieve a hybrid IT environment in your data center, download the free HPE Synergy for Dummies eBook.

To read more articles from Gary, check out the HPE Converged Data Center Infrastructure blog.


HPE brings composable IT to VMware Cloud


HPE Synergy meets VMware Cloud Foundation

Cloud paradigms and virtual machines have transformed computing, and their impact on the computing industry is only expected to increase over the coming years. However, IT still demands physical devices, and being able to switch between both virtual and physical hardware can be a tremendous boon. HPE Synergy aims to bridge this gap by providing the first composable infrastructure for VMware private clouds.

Composability

In the past, handling IT demands meant looking at specialized hardware and choosing the right processing, storage, and networking devices for a company’s needs. Today, computing is viewed in more abstract terms. HPE’s Synergy provides a powerful means to use hardware for vastly different roles, letting companies transform their IT infrastructure with just a few clicks. In real-world environments, this technology simplifies IT management while providing unprecedented versatility.

In 2017, a competitive business needs “technology that enables them to quickly introduce and scale new services,” according to Ric Lewis, senior vice president and general manager of the Software-Defined and Cloud Group at HPE.

HPE Synergy with VMware Cloud Foundation will deliver a private cloud experience that empowers IT to be an internal service provider and enables rapid response to business needs with single-click DevOps delivery.

A Natural Collaboration

Although composability is a new concept, some of its fundamental ideas are not. Virtualization has long provided a means for making hardware more versatile, and VMware has been at the forefront of developing this technology for decades. Although cloud computing has changed how businesses view virtualization, the technology behind cloud computing is heavily based on virtualization principles. HPE’s leadership in cloud computing and composability makes the collaboration between VMware and HPE Synergy a natural fit.


The abstract nature of virtualization adds overhead compared to more traditional computing, and many believe this leads to less efficient hardware utilization. However, virtualization, especially when fueled by composable design, leads to greater flexibility and better hardware usage. When compared to more traditional hardware, HPE Synergy delivers virtual machines while lowering costs by 29 percent. Compared to public cloud offerings, HPE’s technology represents savings of 50 percent. These savings come on top of better productivity, lower operating expenses and reduced capital expenditure.

Fighting Fragmentation

Fragmentation is inevitable in IT, and companies benefit from having so many options to choose from. However, fragmentation also makes internal management more difficult. Collaboration between HPE and VMware means companies can rely on a more unified design, lowering operating costs and providing additional power. This new technology will also provide powerful automation capabilities by letting HPE OneView automate VMware tools. Powerful analytical tools help IT professionals tailor infrastructure to their company’s needs, ensuring better efficiency and smoother operations.

Although public cloud offerings provide excellent service, in-house hardware and software will continue playing a major role in companies across the globe. The concept of composability provides a new and robust way of analyzing hardware, and bringing VMware’s tools into HPE’s Synergy platform offers even greater power.

A few months back we spoke to Ric Lewis from HPE about Synergy and what it can do. Listen below or on Apple Podcasts.


Cloud28+ expands across Asia Pacific and Japan


In Singapore this week, Hewlett Packard Enterprise is launching its Cloud28+ initiative across the Asia Pacific and Japan region. This expansion marks the significant and ongoing growth of the initiative, which aims to build an international community focused on spurring cloud adoption by fighting fragmentation and connecting businesses and organizations.

The Cloud28+ initiative creates a one-stop cloud community and catalog for customers of all sizes to use. The platform is especially helpful for today’s businesses, which typically rely on hybrid approaches to IT infrastructure, often composed of multiple cloud environments and partners. Cloud28+ partners tailor their offerings to comply with local regulations, making it easier for companies to build the right environment for their needs.

We recently spoke with Chris Gabriel, Chief Digital Officer at TechPulse Group to hear why Cloud28+ is “a gamechanger” for the IT industry.

Now Available In Local Languages

Resources are available in Korean, Japanese, and English. Cloud28+’s growth has been fast; since its international expansion earlier in 2017, more than 40 HPE partners throughout the APJ region have joined the community. In addition to providing a convenient platform of cloud services across the globe, Cloud28+ also offers in-depth thought leadership to help end users learn how the right cloud resources can work for them. The strength of Cloud28+ lies in its open nature, and partners can contact each other and join forces to respond to complex digital projects together.

HPE isn’t alone in building up the Cloud28+ initiative: Microsoft is also a partner and will work with HPE to form a go-to-market alliance for Azure Stack. Cloud28+ aims to create an integrated approach to benefit all companies with business relating to the cloud. The initiative will help developers utilize HPE and Microsoft Azure Stack solutions, creating opportunities for new revenue streams and improved market visibility. Customers will benefit by gaining access to a comprehensive collection of solutions, making it easier to navigate the ever-expanding cloud market. Other partners include Intel, Deloitte and Hitachi.

Cloud28+ is free; there’s no cost to search the platform, and there’s no cost associated with connecting to service providers in your region. Customers who don’t know which solution is right for their businesses, or who require something that meets local compliance regulations, have another powerful option: sending proposal requests directly to partners. Service providers can then send their offers, helping customers learn what options are available and which providers have the expertise their cloud needs demand. Customers looking to explore their options can also turn to Cloud28+’s advanced search features.

Cloud28+ continues to grow, and its recent expansion will add more to the global catalog of services. In total, Cloud28+ boasts 600 partners, and these partners provide more than 20,000 services. Fragmentation is unavoidable in the cloud field, but an open, free, and searchable catalog ensures customers and service providers can make the most of all that the cloud has to offer.
