Sunday, February 18, 2018

Best Practices and Updates for AWS Database Migration Services

Our partners at AWS have a plethora of resources available to AWS users covering a wide variety of subjects. One recently released white paper, “AWS Database Migration Services Best Practices,” dives into the ins and outs of migrating from a source database to a target database. It can be a lot to digest. As a Relational Database Services partner that has worked extensively with the AWS Database Migration Service since its launch, we can offer some insight into how to best utilize it.

Companies looking to move to a cloud platform or a more modern database – or perhaps migrate for licensing reasons – have often had to plan for extensive outages when making the change. The AWS Database Migration Service (AWS DMS) enables organizations to migrate data to and from a variety of databases located in Amazon Relational Database Service (Amazon RDS), those running on Amazon EC2, and those running on-premises. The service supports homogeneous migrations, such as Oracle to Oracle, as well as heterogeneous migrations, such as Oracle to Amazon Aurora. In either case, the service tracks changes being made to the source database so that they can be applied to the target database to keep the two in sync. You can also customize table mappings and perform translations.

When using AWS DMS, a user provisions a replication server, defines source and target endpoints, and creates a task to migrate the data. A standard task has three major phases: the full load, the application of cached changes, and ongoing replication.
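As a rough sketch of that workflow, the steps above map onto a few AWS CLI calls. The identifiers, hostnames, credentials, and the Oracle-to-Aurora engine pairing below are hypothetical placeholders, and the `table-mappings.json` file would need to be authored separately:

```shell
# Define a source endpoint (hypothetical Oracle source)
aws dms create-endpoint \
  --endpoint-identifier my-oracle-source \
  --endpoint-type source \
  --engine-name oracle \
  --server-name source-db.example.com --port 1521 \
  --username admin --password '<password>' \
  --database-name ORCL

# Define a target endpoint (hypothetical Aurora target)
aws dms create-endpoint \
  --endpoint-identifier my-aurora-target \
  --endpoint-type target \
  --engine-name aurora \
  --server-name target-cluster.example.com --port 3306 \
  --username admin --password '<password>'

# Create the migration task: full load plus ongoing change replication
aws dms create-replication-task \
  --replication-task-identifier my-migration-task \
  --source-endpoint-arn <source-endpoint-arn> \
  --target-endpoint-arn <target-endpoint-arn> \
  --replication-instance-arn <replication-instance-arn> \
  --migration-type full-load-and-cdc \
  --table-mappings file://table-mappings.json
```

The `--migration-type full-load-and-cdc` flag corresponds to the standard three-phase task: the full load, the application of cached changes, and ongoing replication.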

There are a handful of important considerations when creating a task. You’ll want to choose your migration type: migrating existing data, replicating changes, or a combination of both. You’ll also need to determine whether you want a task to start upon creation (the default on AWS), and tell AWS DMS what to do with tables that already exist on the target. AWS DMS also includes options for dealing with large objects (LOBs). Since they require more processing and resources than standard objects, you can choose to exclude them, use limited LOB mode (which truncates LOBs beyond a maximum size you set), or use full LOB mode, which migrates LOBs regardless of their size. It’s also a smart approach to enable logging, as that will keep you updated on any informational, error, or warning messages.
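Those choices are expressed in the task settings JSON passed to the task. A minimal fragment might look like the following; the specific values here (a 32 KB LOB cap, drop-and-recreate table handling) are illustrative, not recommendations:

```json
{
  "TargetMetadata": {
    "SupportLobs": true,
    "FullLobMode": false,
    "LimitedSizeLobMode": true,
    "LobMaxSize": 32
  },
  "FullLoadSettings": {
    "TargetTablePrepMode": "DROP_AND_CREATE"
  },
  "Logging": {
    "EnableLogging": true
  }
}
```

Setting `FullLobMode` to `true` instead would migrate LOBs of any size, at the cost of more processing on the replication instance.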

In addition, AWS just announced enhancements for Amazon RDS for PostgreSQL. Users can now use RDS for PostgreSQL as a source for AWS DMS. From the AWS blog:

Amazon RDS for PostgreSQL now supports logical replication on RDS for PostgreSQL versions 9.4.9 and 9.5.4. With logical replication, we have added support for logical decoding, and you can setup logical replication slots on your instance and stream changes from the database through these slots. A new rds_replication role, assigned to the master user by default, can be used to grant permissions to manipulate and stream data through logical slots. To enable logical replication, set the parameter rds.logical_replication to 1. These logical slots also enable the RDS for PostgreSQL to be used as a source for AWS Database Migration Service (DMS).
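To sketch what that setup looks like in practice, you might set the parameter in a custom DB parameter group and then create a logical slot from a SQL session. The parameter group name, endpoint, and slot name below are hypothetical; `test_decoding` is the standard PostgreSQL example output plugin:

```shell
# Enable logical replication in the instance's custom parameter group
# (a static parameter, so the instance must be rebooted to apply it)
aws rds modify-db-parameter-group \
  --db-parameter-group-name my-pg-params \
  --parameters "ParameterName=rds.logical_replication,ParameterValue=1,ApplyMethod=pending-reboot"

# After the reboot, create a logical replication slot from SQL
psql -h my-instance.example.rds.amazonaws.com -U master -d mydb \
  -c "SELECT * FROM pg_create_logical_replication_slot('my_slot', 'test_decoding');"
```

Because the `rds_replication` role is granted to the master user by default, the master user can create and stream from slots immediately; other users need the role granted explicitly.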

Finally, AWS recently announced continuous data replication for DMS, which enables you to keep your database up to date after the initial migration. With this feature, you can now have high availability during the replication process by specifying Multi-AZ when you create your replication instances.
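If a replication instance was originally created as a single-AZ instance, it can be converted in place. This is a hypothetical invocation with a placeholder ARN:

```shell
# Convert an existing replication instance to Multi-AZ for high availability
aws dms modify-replication-instance \
  --replication-instance-arn <replication-instance-arn> \
  --multi-az \
  --apply-immediately
```

New instances can equally be created Multi-AZ from the start by passing `--multi-az` to `create-replication-instance`.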

You can learn more about using RDS for PostgreSQL as a source for DMS in the AWS Database Blog and RDS documentation. You can also refer back to our previous blog post on AWS DMS from when it launched in preview mode earlier this year. And to dive further into AWS DMS, check out the Best Practices whitepaper.

About David Lucky

As Datapipe’s Director of Product Management, David has unique insight into the latest product developments for private, public, and hybrid cloud platforms and a keen understanding of industry trends and their impact on business development. David writes about a wide variety of topics including security and compliance, AWS, Microsoft, and business strategy.
