Oracle Data Pump is a utility that allows you to export Oracle data to a dump file and import it into another Oracle database. It is the recommended way to move large amounts of data from an Oracle database to an Amazon RDS DB instance. As you may know, AWS offers two ways to run Oracle: on EC2 and on RDS. The EC2 service gives you full control with root access to the operating system, while RDS gives you no OS-level access at all; instead, an RDS instance provides a master user with a limited set of administrative privileges.
Exporting an Oracle Schema with Data Pump and Importing It into Amazon RDS Using an S3 Bucket
In this blog post I’ll explain step by step how to import a schema into AWS RDS using Oracle built-in packages, since you have no OS-level access to the AWS RDS environment.
Summary
- Export the schema “DBTEST_2023” from an on-premises Oracle database version 19.0.0.0.0 running on an Oracle Linux server into the dump file “EXPORT_DBTEST_2023.dmp”.
- Upload the dump file to an AWS S3 bucket.
- Download the dump file into the Data Pump directory of the AWS RDS database.
- Import the dump file into the AWS RDS database, remapping the schema name to “DBTEST_RDS”.
Prerequisites
- Create an S3 bucket in the same region from the AWS account management console.
- Grant an access role with read/write privileges.
- Upload the dump file to the S3 bucket.
- Download the dump file to the RDS Data Pump directory.
- Allow access from the EC2 instance to RDS to perform dump import/export activities.
**An S3 bucket is shared storage in the AWS cloud that RDS instances can use to upload and download files.
Step 1: Export Schema
Export the schema on the source EC2 instance or the local database server with the following command.
expdp system/password DIRECTORY=DATA_PUMP_DIR SCHEMAS=DBTEST_2023 DUMPFILE=EXPORT_DBTEST_2023.dmp LOGFILE=EXPORT_DBTEST_2023.log COMPRESSION=ALL
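The dump file is written to the database directory referenced by DIRECTORY (DATA_PUMP_DIR here). If you are not sure which OS path that maps to on the source server, a quick dictionary lookup shows it (run as a privileged user):

-- Show the OS path behind the DATA_PUMP_DIR directory object on the source database
SELECT directory_name, directory_path
FROM   dba_directories
WHERE  directory_name = 'DATA_PUMP_DIR';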
Note:
In case you are importing from an Enterprise Edition database into a Standard Edition database, make sure to reset all tables that have the COMPRESSION option enabled to NOCOMPRESS before exporting the data, because Standard Edition doesn’t support the COMPRESSION feature.
ALTER TABLE DBTEST_2023.compressed_table NOCOMPRESS;
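If the schema contains many compressed tables, you can generate the required statements from the data dictionary instead of writing them one by one (a sketch, assuming you have access to DBA_TABLES; adjust the owner to your source schema):

-- Generate an ALTER ... NOCOMPRESS statement for every compressed table in the schema
SELECT 'ALTER TABLE ' || owner || '.' || table_name || ' NOCOMPRESS;' AS fix_stmt
FROM   dba_tables
WHERE  owner = 'DBTEST_2023'
AND    compression = 'ENABLED';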
Step 2: Configure AWS S3 Bucket
To create and configure the S3 bucket from the S3 Browser application on your machine, use the following steps.
Open S3 Browser -> Accounts -> Add New Account:
Account Name: arn:aws:s3:::test-ora-test
Account Type: Amazon S3 Storage
Access Key ID: AFFFTFWNINVCRTWLBWII5Q
Secret Access Key: EdfeergzA1+kErss2kg53nudSv/Il/xx2x2wvwiuy
Click “Add New Account”
Accounts -> click “rds-bucket” -> Click “Yes” to add ‘External bucket’ -> Bucket Name: “rds-bucket”.
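Alternatively, if you have the AWS CLI installed and configured with the same credentials, the bucket behind the ARN used in this post can be created from the command line (a sketch; the region is an assumption, so adjust it to match your RDS instance):

# Create the bucket in the same region as the RDS instance (region is a placeholder)
aws s3 mb s3://test-ora-test --region us-east-1
# Verify the bucket exists
aws s3 ls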
Step 3: Upload DB schema to AWS S3 Bucket
- Upload the exported dump file to the S3 bucket from the source EC2 instance or the local on-premises database server.
- In my case I exported the schema from a local on-premises database server, so I upload the dump file to the S3 bucket from the S3 Browser window.
- Alternatively, use the Amazon RDS procedure rdsadmin.rdsadmin_s3_tasks.upload_to_s3 to copy the dump file from the instance’s DATA_PUMP_DIR to the Amazon S3 bucket.
SELECT rdsadmin.rdsadmin_s3_tasks.upload_to_s3(
      p_bucket_name    => 'arn:aws:s3:::test-ora-test',
      p_s3_prefix      => 'DBDUMP/EXPORT_DBTEST_2023.dmp',
      p_directory_name => 'DATA_PUMP_DIR')
AS TASK_ID FROM DUAL;
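The call returns a task ID. You can check the progress and result of the transfer by reading the task’s log file from the BDUMP directory (replace the task ID placeholder with the value returned by the query above):

-- Read the S3 transfer task log; substitute <task_id> with the returned task ID
SELECT text
FROM   TABLE(rdsadmin.rds_file_util.read_text_file('BDUMP', 'dbtask-<task_id>.log'));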
Note:
S3 Browser is a third-party Windows GUI tool that helps you upload and download files to and from an S3 bucket. You can download it here:
https://s3browser.com/download.aspx
Step 4: Create the user on the RDS database instance and grant privileges to the database user.
To create the user and grant the necessary privileges on the RDS for Oracle target instance:
- Use SQL*Plus or Oracle SQL Developer to log in as the master user to the RDS for Oracle DB instance into which the data will be imported.
- If the user account into which the data will be imported doesn’t exist, create it and grant the necessary permissions and roles (note that the default tablespace RDSTEST_TBS is created in Step 5, so create the tablespace first if needed).
CREATE USER DBTEST_RDS IDENTIFIED BY password DEFAULT TABLESPACE RDSTEST_TBS;
GRANT CREATE SESSION, RESOURCE TO DBTEST_RDS;
ALTER USER DBTEST_RDS QUOTA 100M ON RDSTEST_TBS;
Step 5: Create the required tablespaces before you import the data.
To create the tablespace on RDS, use the following statement.
CREATE SMALLFILE TABLESPACE “RDSTEST_TBS” DATAFILE SIZE 100M AUTOEXTEND ON NEXT 100M LOGGING EXTENT MANAGEMENT LOCAL AUTOALLOCATE SEGMENT SPACE MANAGEMENT AUTO;
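To confirm the tablespace was created and is available before the import, you can query the data dictionary:

-- Verify the new tablespace exists and is online
SELECT tablespace_name, status, contents
FROM   dba_tablespaces
WHERE  tablespace_name = 'RDSTEST_TBS';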
Step 6: Download the dump file from your Amazon S3 bucket to your target DB instance.
To download your dump file
- Start SQL*Plus or Oracle SQL Developer and log in as the master user on your Amazon RDS target Oracle DB instance.
- Download the dump file using the Amazon RDS procedure rdsadmin.rdsadmin_s3_tasks.download_from_s3. The example below downloads all files from an Amazon S3 bucket named arn:aws:s3:::test-ora-test to the directory DATA_PUMP_DIR.
SELECT rdsadmin.rdsadmin_s3_tasks.download_from_s3(
      p_bucket_name    => 'arn:aws:s3:::test-ora-test',
      p_s3_prefix      => 'export_files/',
      p_directory_name => 'DATA_PUMP_DIR')
AS TASK_ID FROM DUAL;
If you want to download a specific dump file from a particular folder, use the following query.
SELECT rdsadmin.rdsadmin_s3_tasks.download_from_s3(
      p_bucket_name    => 'arn:aws:s3:::test-ora-test',
      p_s3_prefix      => 'DBDUMP/EXPORT_DBTEST_2023.dmp',
      p_directory_name => 'DATA_PUMP_DIR')
AS TASK_ID FROM DUAL;
Once the download is complete, query the downloaded files under DATA_PUMP_DIR using this query:
SELECT * FROM TABLE(rdsadmin.rds_file_util.listdir('DATA_PUMP_DIR')) ORDER BY mtime;
How to resolve ORA-00904
In case you get the below error message while downloading the dump file from the S3 bucket:
ORA-00904: "RDSADMIN"."RDSADMIN_S3_TASKS"."DOWNLOAD_FROM_S3": invalid identifier
This means that S3 integration is not configured for your RDS instance.
To configure S3 integration (documentation: https://docs.aws.amazon.com/AmazonRDS/latest/UserGuide/oracle-s3-integration.html), first create an IAM policy that grants access to the bucket:
- Open the IAM Management Console: https://console.aws.amazon.com/iam/home?#/home
- In the navigation pane, choose Policies -> Create policy. On the Visual editor tab, choose “Choose a service”, then choose S3 and check “All S3 actions”.
- Choose Resources, choose Add ARN for the bucket, and enter the bucket name: arn:aws:s3:::test-ora-test
- Click Review Policy -> give it the name “eps-s3-integration” -> Create Policy.
Then associate your IAM role with your RDS DB instance:
- Sign in to the AWS Management Console: https://console.aws.amazon.com/rds/
- Choose the Oracle DB instance name, then open the Connectivity & security tab and find the Manage IAM roles section.
- Under “IAM roles to this instance”, select “eps-s3-integration”; under Feature, select S3_INTEGRATION.
- Click “Add role”.
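The same association can also be done with the AWS CLI (a sketch; the DB instance identifier and role ARN below are placeholders for your own values):

# Attach the IAM role to the RDS instance for the S3_INTEGRATION feature
aws rds add-role-to-db-instance \
    --db-instance-identifier my-oracle-instance \
    --role-arn arn:aws:iam::123456789012:role/eps-s3-integration \
    --feature-name S3_INTEGRATION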
Step 7: Import your dump file into your target DB instance using DBMS_DATAPUMP
Open a session from SQL Developer and make sure it stays connected for as long as the import is running. As the RDS master user, execute the following block of code; it keeps running in the foreground, which lets you monitor the import job on the fly and see any errors as they occur.
DECLARE
  ind          NUMBER;        -- Loop index
  h1           NUMBER;        -- Data Pump job handle
  percent_done NUMBER;        -- Percentage of job complete
  job_state    VARCHAR2(30);  -- To keep track of job state
  le           ku$_LogEntry;  -- For WIP and error messages
  js           ku$_JobStatus; -- The job status from get_status
  jd           ku$_JobDesc;   -- The job description from get_status
  sts          ku$_Status;    -- The status object returned by get_status
BEGIN
  h1 := DBMS_DATAPUMP.OPEN(operation => 'IMPORT', job_mode => 'SCHEMA', job_name => null);
  -- Specify the single dump file and its directory
  DBMS_DATAPUMP.ADD_FILE(handle => h1, directory => 'DATA_PUMP_DIR', filename => 'EXPORT_DBTEST_2023.dmp');
  -- Specify the logfile for the import process: [very important to read it after the import completes]
  DBMS_DATAPUMP.ADD_FILE(handle => h1, directory => 'DATA_PUMP_DIR', filename => 'EXPORT_DBTEST_2023.LOG', filetype => DBMS_DATAPUMP.KU$_FILE_TYPE_LOG_FILE);
  -- Disable archivelog for the import: [12c new feature]
  -- DBMS_DATAPUMP.METADATA_TRANSFORM(handle => h1, name => 'DISABLE_ARCHIVE_LOGGING', value => 1);
  -- REMAP SCHEMA: import DBTEST_2023 into DBTEST_RDS
  DBMS_DATAPUMP.METADATA_REMAP(h1, 'REMAP_SCHEMA', 'DBTEST_2023', 'DBTEST_RDS');
  -- If a table already exists: [SKIP, REPLACE, TRUNCATE]
  DBMS_DATAPUMP.SET_PARAMETER(h1, 'TABLE_EXISTS_ACTION', 'SKIP');
  -- REMAP TABLESPACE:
  -- DBMS_DATAPUMP.METADATA_REMAP(h1, 'REMAP_TABLESPACE', 'EPS', 'TBS_EPS');
  -- Start the job. An exception is raised if something is not set up properly.
  DBMS_DATAPUMP.START_JOB(h1);
  -- The following loop monitors the job until it completes; meanwhile progress information is displayed:
  percent_done := 0;
  job_state := 'UNDEFINED';
  while (job_state != 'COMPLETED') and (job_state != 'STOPPED') loop
    dbms_datapump.get_status(h1,
                             dbms_datapump.ku$_status_job_error +
                             dbms_datapump.ku$_status_job_status +
                             dbms_datapump.ku$_status_wip, -1, job_state, sts);
    js := sts.job_status;
    -- If the percentage done changed, display the new value.
    if js.percent_done != percent_done
    then
      dbms_output.put_line('*** Job percent done = ' || to_char(js.percent_done));
      percent_done := js.percent_done;
    end if;
    -- If any work-in-progress (WIP) or error messages were received for the job, display them.
    if (bitand(sts.mask, dbms_datapump.ku$_status_wip) != 0)
    then
      le := sts.wip;
    else
      if (bitand(sts.mask, dbms_datapump.ku$_status_job_error) != 0)
      then
        le := sts.error;
      else
        le := null;
      end if;
    end if;
    if le is not null
    then
      ind := le.FIRST;
      while ind is not null loop
        dbms_output.put_line(le(ind).LogText);
        ind := le.NEXT(ind);
      end loop;
    end if;
  end loop;
  -- Indicate that the job finished and gracefully detach from it.
  dbms_output.put_line('Job has completed');
  dbms_output.put_line('Final job state = ' || job_state);
  dbms_datapump.detach(h1);
END;
/
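While the block is running you can also watch the job from a second session, and once it finishes you can read the import log directly from DATA_PUMP_DIR (adjust the log file name if you changed it in the block above):

-- Check the state of running Data Pump jobs from another session
SELECT owner_name, job_name, operation, job_mode, state
FROM   dba_datapump_jobs;

-- Read the import log after the job completes
SELECT text
FROM   TABLE(rdsadmin.rds_file_util.read_text_file('DATA_PUMP_DIR', 'EXPORT_DBTEST_2023.LOG'));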
Step 8: Clean up
After the data has been imported, you can delete the files that you don’t want to keep.
To remove unneeded files
- Start SQL*Plus or SQL Developer and log in as the master user to your RDS for Oracle DB instance.
- List the files in DATA_PUMP_DIR using the following command.
SELECT * FROM TABLE(rdsadmin.rds_file_util.listdir('DATA_PUMP_DIR')) ORDER BY MTIME;
- To delete files in DATA_PUMP_DIR that you no longer require, use the following command.
EXEC UTL_FILE.FREMOVE('DATA_PUMP_DIR','filename');
EXEC UTL_FILE.FREMOVE('DATA_PUMP_DIR','EXPORT_DBTEST_2023.dmp');
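If you want to clear out the whole directory after a successful import, a small PL/SQL loop over the directory listing does it (a sketch; review the listing first, because this removes every file in DATA_PUMP_DIR):

-- Remove every file currently listed in DATA_PUMP_DIR
BEGIN
  FOR f IN (SELECT filename
            FROM   TABLE(rdsadmin.rds_file_util.listdir('DATA_PUMP_DIR'))
            WHERE  type = 'file') LOOP
    UTL_FILE.FREMOVE('DATA_PUMP_DIR', f.filename);
  END LOOP;
END;
/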
Conclusion
In this blog post we walked through how to import a schema into AWS RDS. Oracle provides the Data Pump utility to move data from a local database server to an AWS RDS instance. We exported the dump, created and configured the S3 bucket, uploaded and downloaded the dump file to and from the Data Pump directory, and finally imported the dump into the AWS RDS instance using the DBMS_DATAPUMP package.