Moving Mountains of Data off S3 with Jeremy Daer
By 37signals
Categories: Startup, Product
Summary
Migrating 5 petabytes of data from 5 billion S3 objects is daunting, but 37signals found a way to do it painlessly by leveraging AWS's exit programs and a well-planned framework for proving the new solution before investing millions.
Key Takeaways
- Build a well-planned framework to prove the viability of a new data storage solution before investing heavily.
- Leverage cloud provider exit programs that provide credits to offset data egress costs when migrating off their platform.
- Understand the hidden costs and constraints of cloud storage, such as paying for data egress, which can create vendor lock-in.
- Anticipate the difficulty and risk of moving massive amounts of mission-critical data, and plan mitigations before starting.
- Work with experienced account reps from cloud providers to navigate the complexities of their exit programs and data migration.
- Recognize that migrating data out of the cloud, even from a trusted provider like AWS, demands significant planning, risk assessment, and careful execution.
Topics
- Data Migration
- Cloud Storage Optimization
- Vendor Lock-in Mitigation
- Cloud Provider Exit Strategies
- Risk Management in Data Migrations
Transcript Excerpt
Well, yeah, there's no real difficulty. It's just copying some stuff, right? Those five petabytes of data are spread across hundreds of buckets and on the order of 5 billion objects. The concept is straightforward, but you've got to be careful and do it right. And there are easy ways to do it right, but there are a lot of easy ways to go wrong too. >> This is for portables, a place where the team at 37signals shares their behind-the-scenes work building Basecamp, HEY, and open-source proj...
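The caution Jeremy describes comes down to verification: with billions of objects, a silently failed or corrupted copy is easy to miss. One common safeguard is comparing per-object inventories of the source and destination. This is a minimal illustrative sketch, not 37signals' actual tooling; the `diff_manifests` helper, the sample keys, and the checksum values are all hypothetical, and in practice the source side would come from something like an S3 Inventory report.

```python
# Hypothetical manifests mapping object key -> (size_bytes, checksum).
# In a real migration the source manifest would come from an S3 Inventory
# report and the destination from an equivalent listing of the new store.

def diff_manifests(source, destination):
    """Return keys missing from the destination and keys whose
    size or checksum does not match the source."""
    missing = [key for key in source if key not in destination]
    mismatched = [
        key for key, meta in source.items()
        if key in destination and destination[key] != meta
    ]
    return missing, mismatched

source = {
    "bucket-a/photo.jpg": (1024, "d41d8cd9"),
    "bucket-a/doc.pdf": (2048, "8f14e45f"),
    "bucket-b/backup.tar": (4096, "45c48cce"),
}
destination = {
    "bucket-a/photo.jpg": (1024, "d41d8cd9"),
    "bucket-b/backup.tar": (4096, "deadbeef"),  # checksum differs: bad copy
}

missing, mismatched = diff_manifests(source, destination)
# missing -> ["bucket-a/doc.pdf"], mismatched -> ["bucket-b/backup.tar"]
```

At this scale the comparison itself has to be done in batches rather than in memory, but the principle is the same: never declare the migration done until every object is accounted for on both sides.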