You’ve probably heard of cloud repatriation, the act of bringing workloads once in the public cloud back on premises. Public discourse often paints it in mythic proportions: Is it even real? And if so, how much is actually happening?
The rise of the public cloud fundamentally shifted how IT operates. Many organizations moved to the cloud to take advantage of its scalability, flexibility and pay-per-use model. It’s not controversial to say the public cloud comes with many benefits, as well as challenges and risks. As organizations have matured in their public cloud use, many have weighed the advantages and disadvantages to determine the strategy that best aligns with their goals and objectives. For some, that strategy has included moving workloads from the public cloud back to an on-premises data center or colocation facility. For example, Dropbox improved its margins 34% over a two-year period by moving workloads from the public cloud to in-house and colocation infrastructure, and 37signals estimates it will save $7 million over five years by doing the same.
It stands to reason, then, that cloud repatriation can be viewed as a workload placement tactic within a broader multicloud strategy: some workloads benefit from the public cloud’s rapid scalability, some run best on-premises and some may benefit from the growing possibilities at the edge.
Understanding why organizations repatriate workloads can help you think through your own multicloud strategy. Below are some reasons to consider as you make informed decisions about your IT infrastructure.
- Lower Operational Costs
There are numerous examples of on-premises infrastructure being less expensive than cloud-based alternatives. For steady-state workloads that don’t require rapid scalability, many IT leaders find significant cost efficiencies running on-premises versus the public cloud; Ahrefs, for instance, recently estimated it has saved $400 million by not moving to the public cloud. Many organizations have also found savings by reducing their spend on cloud management tools, third-party services and other cloud-based resources required for performance optimization.
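To make that concrete, here is a minimal back-of-envelope sketch in Python comparing a steady-state workload’s recurring public cloud spend with the amortized cost of running it on owned hardware. Every figure in it is a hypothetical placeholder rather than real pricing from any provider (or from Dell); substitute your own cloud bills and hardware quotes.

```python
# Hypothetical back-of-envelope comparison for a steady-state workload.
# All dollar figures are illustrative placeholders, not real vendor pricing.

def monthly_cloud_cost(compute: float, storage: float, egress: float) -> float:
    """Sum of the recurring public cloud line items for the workload."""
    return compute + storage + egress

def monthly_on_prem_cost(hardware_capex: float, amortization_months: int,
                         ops_per_month: float) -> float:
    """Hardware cost spread over its useful life, plus power, space and admin."""
    return hardware_capex / amortization_months + ops_per_month

cloud = monthly_cloud_cost(compute=18_000, storage=4_000, egress=2_500)
on_prem = monthly_on_prem_cost(hardware_capex=450_000,
                               amortization_months=60,  # assumed 5-year refresh
                               ops_per_month=6_000)

print(f"Public cloud: ${cloud:,.0f}/month")
print(f"On-premises:  ${on_prem:,.0f}/month")
print(f"Difference:   ${cloud - on_prem:,.0f}/month")
```

A real comparison would also factor in migration effort, staffing and the elasticity you give up, but even this simple model shows why steady, predictable workloads are the usual repatriation candidates.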
- More Predictable and Transparent Cost Structures
With your infrastructure investments predominantly in the public cloud, the variable pricing model can make monthly spend difficult to predict. The need to rapidly scale – for example, when a new app you launched becomes a runaway hit – may also result in runaway cloud costs, making your cloud bills harder to understand each month. In contrast, on-premises infrastructure investments typically come with more budget stability and predictability, and you don’t have to worry about fluctuations in public cloud pricing. At Dell, we even offer pay-per-use solutions with a single billing rate so you can accurately predict future costs.
- Better Resource Utilization
If you already own on-premises infrastructure, you probably have a keen interest in making the most of that investment. Being smart about which workloads run on-premises means better utilizing existing capacity, making the most of your hardware investments and avoiding the costs associated with underutilized or overprovisioned resources. You also simply have more control over resources under your own roof: you can be selective about equipment purchases rather than paying for pre-allocated resources in the public cloud.
- Fewer Data Transfer Costs
There are fees when data is transferred between public cloud providers or from the public cloud to on-premises infrastructure, and over time they can really add up. Data egress fees are such a common complaint about the public cloud that entire articles are written about how to avoid them. It makes sense that repatriating workloads helps here: if the data is not in the public cloud, you don’t have to pay to access or move it. Moving workloads into an on-premises environment or a colocation facility lets you maintain control of that data, and even take advantage of public cloud services, while avoiding unnecessary transfer fees.
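As a rough illustration of how these fees add up, the short sketch below estimates annual egress spend from a monthly transfer volume and a per-gigabyte rate. Both inputs are assumptions for illustration only; actual rates vary by provider, region, tier and any committed-use discounts.

```python
# Hypothetical estimate of annual data egress spend.
# The per-GB rate is a placeholder; check your provider's current price list.

def annual_egress_cost(gb_out_per_month: float, price_per_gb: float) -> float:
    """Rough yearly bill for data leaving the public cloud."""
    return gb_out_per_month * price_per_gb * 12

# Example: a workload pushing 50 TB out of the cloud each month
# at an assumed rate of $0.09 per GB.
cost = annual_egress_cost(gb_out_per_month=50_000, price_per_gb=0.09)
print(f"Estimated egress spend: ${cost:,.0f} per year")
```

Even modest changes to either input move the annual figure significantly, which is why egress is worth modeling before you commit to a data placement approach.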
- Reduced Risk of Vendor Lock-in
The less you have invested in a particular vendor’s ecosystem, the less dependent you are on that vendor and the easier it is to shift away should you need (or want) to do so. This gives you more options if you want to take advantage of cutting-edge technologies in another vendor’s ecosystem, or simply puts you in a better negotiating position. Bringing workloads back on-premises is an option here, too, providing more control over your infrastructure, often with greater cost efficiency than the public cloud.
Reminder: Cloud Repatriation is Just One Tactic in a Multicloud Strategy
Ultimately, a strong multicloud strategy incorporates different capabilities from multiple cloud environments and embraces a diverse ecosystem to best meet business needs. This means workloads may live in a range of locations: public clouds, colocation facilities, edge locations and, yes, on-premises. With careful consideration, you can determine the right mix for your organization, one that unlocks all the benefits of the cloud experience with maximum cost efficiency and without trade-offs on performance, security or control over data. It’s one of the reasons we created our Dell APEX portfolio of as-a-service solutions: to help customers simplify their cloud experience and accelerate business results.
Here’s where you can learn more about Dell APEX.