Tuesday 3 January 2012

Import/Export data between PeopleSoft instances

To move data between PeopleSoft instances we have many tools. Data Mover is a PeopleSoft-delivered tool, but it is limited to PeopleSoft objects only and is very slow for huge objects; it sometimes hangs and behaves strangely in large production systems. EXP/IMP are the traditional Oracle tools used for export and import, but they have some limitations, especially with encrypted columns. From 10g onwards the most useful data movement tools with Oracle are expdp and impdp (Data Pump), the enhanced versions of EXP/IMP. I will explain more of the Data Pump options here.

Exporting/importing a whole schema (SYSADM)
Creating a DB directory:
CREATE OR REPLACE DIRECTORY test_dir AS '/u01/app/oracle/oradata/';
GRANT READ, WRITE ON DIRECTORY test_dir TO dba;

expdp dba/pwd@source schemas=SYSADM directory=TEST_DIR dumpfile=SCHEMA.dmp logfile=SCHEMA.log parallel=4
Set the PARALLEL option based on your system capacity.
The log file and dump file will be created in TEST_DIR. To find the TEST_DIR path on the server, do a select * from dba_directories.
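For example, a more targeted query (TEST_DIR being the directory created above):
SELECT directory_name, directory_path FROM dba_directories WHERE directory_name = 'TEST_DIR';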
Move the dump files to the target server's TEST_DIR directory before the import.
impdp dba/pwd@target schemas=SYSADM directory=TEST_DIR dumpfile=SCHEMA.dmp logfile=SCHEMA.log TABLE_EXISTS_ACTION=REPLACE
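Note: with a single dump file, the PARALLEL workers end up serializing on that one file. If you want each worker writing its own file, the %U substitution variable in DUMPFILE is the usual approach; a minimal sketch of that variant (same connections as above):
expdp dba/pwd@source schemas=SYSADM directory=TEST_DIR dumpfile=SCHEMA_%U.dmp logfile=SCHEMA.log parallel=4
impdp dba/pwd@target schemas=SYSADM directory=TEST_DIR dumpfile=SCHEMA_%U.dmp logfile=SCHEMA.log parallel=4 TABLE_EXISTS_ACTION=REPLACE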

Exporting/importing a few tables

expdp dba/pwd@source tables=SYSADM.PS_A,SYSADM.PS_B directory=TEST_DIR dumpfile=DMP_TABLES.dmp logfile=DMP_TABLES.log
impdp dba/pwd@target tables=SYSADM.PS_A,SYSADM.PS_B directory=TEST_DIR dumpfile=DMP_TABLES.dmp logfile=DMP_TABLES.log TABLE_EXISTS_ACTION=REPLACE

INCLUDE and EXCLUDE

The INCLUDE and EXCLUDE parameters can be used to limit the export/import to specific objects. When the INCLUDE parameter is used, only those objects specified by it will be included in the export/import. When the EXCLUDE parameter is used, all objects except those specified by it will be included in the export/import. The two parameters are mutually exclusive, so use the parameter that requires the fewest entries to give you the result you require. The basic syntax for both parameters is the same.
expdp dba/pwd@src schemas=SYSADM include=TABLE:"IN ('PSSTATUS', 'PSOPTIONS')" directory=TEST_DIR dumpfile=TEST.dmp logfile=TEST.log
expdp dba/pwd@tgt schemas=SYSADM exclude=TABLE:"= 'PSOPTIONS'" directory=TEST_DIR dumpfile=TEST.dmp logfile=TEST.log
When the parameter is used from the command line, depending on your OS, the special characters in the clause may need to be escaped, as follows. Because of this, it is easier to use a parameter file.
include=TABLE:\"IN (\'EMP\', \'DEPT\')\"
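A minimal sketch of such a parameter file, assuming the file name exp_tables.par (the name is just an illustration; no shell escaping is needed inside the file):
schemas=SYSADM
include=TABLE:"IN ('PSSTATUS', 'PSOPTIONS')"
directory=TEST_DIR
dumpfile=TEST.dmp
logfile=TEST.log
Then run:
expdp dba/pwd@src parfile=exp_tables.par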

The valid object type paths that can be included or excluded can be displayed using the DATABASE_EXPORT_OBJECTS, SCHEMA_EXPORT_OBJECTS, and TABLE_EXPORT_OBJECTS views.
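For instance, to browse the table-related paths valid in schema mode (the LIKE filter is just an illustration):
SELECT object_path, comments FROM schema_export_objects WHERE object_path LIKE '%TABLE%';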
INCLUDE=TABLE,VIEW,PACKAGE:"LIKE '%API'"
or
INCLUDE=TABLE
INCLUDE=VIEW
INCLUDE=PACKAGE:"LIKE '%API'"


IMPDP across network links
You cannot import a dump file into an 11gR1 DB if the export was taken from 11gR2 (Data Pump dump files are not downward compatible unless the export used the VERSION parameter). In such cases we can use the impdp command across a DB link (network link). Here are the steps.
Say we need to do a schema refresh from SRC (11gR2) to TGT DB (11gR1).
1. Create a DB link in TGT pointing to SRC and test the connection (see the sketch after these steps). Say the DB link name in TGT is SRCDBLINK.WORLD.
2. In this case TGT is 11gR1 and SRC is 11gR2, so the command has to be executed from a machine with an 11gR1 client. If you execute it from an 11gR2 client, it throws the error "UDI-00018: Data Pump client is incompatible with database version 11.01.00.07.00".
3. Execute from the 11gR1 client: impdp dba/pwd@TGT network_link=SRCDBLINK.WORLD schemas=SYSADM table_exists_action=replace logfile=test.log (DUMPFILE is not used here: a network-mode import reads directly from the source database over the link, so no dump file is involved).
This will copy the data directly across the network link from SRC to TGT.
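A minimal sketch of step 1, run on TGT, assuming a TNS alias SRC and the dba/pwd account (both the alias and the credentials are assumptions for illustration):
CREATE DATABASE LINK SRCDBLINK.WORLD CONNECT TO dba IDENTIFIED BY pwd USING 'SRC';
-- quick connectivity test
SELECT * FROM dual@SRCDBLINK.WORLD;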

Issue the following commands to see all the options available with expdp and impdp:
expdp help=y
impdp help=y
Status of jobs
Query:
select * from dba_datapump_jobs;
Unlike the original exp and imp utilities, all Data Pump ".dmp" and ".log" files are created on the Oracle server, not the client machine.

All Data Pump actions are performed by multiple jobs (server processes, not DBMS_JOB jobs). These jobs are controlled by a master control process which uses Advanced Queuing. At runtime an advanced queue table, named after the job name, is created and used by the master control process; the table is dropped on completion of the Data Pump job. The job and the advanced queue can be named using the JOB_NAME parameter.

Cancelling the client process does not stop the associated Data Pump job. Issuing "ctrl+c" on the client during a job stops the client output and presents a command prompt. Typing "status" at this prompt allows you to monitor the current job.


The following commands are valid while in interactive mode. Note: abbreviations are allowed.
CONTINUE_CLIENT : Return to logging mode. Job will be re-started if idle.
EXIT_CLIENT : Quit client session and leave job running.
KILL_JOB : Detach and delete job.
STOP_JOB=IMMEDIATE : Perform an immediate shutdown of the Data Pump job.
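You can also re-attach to a running job from a fresh session and use these commands there; a sketch, assuming the job kept the default-style name SYS_IMPORT_FULL_01:
impdp dba/pwd@target attach=SYS_IMPORT_FULL_01
Import> status
Import> continue_client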

Issues
IMPDP Stuck with wait event - "wait for unread message on broadcast channel"
I queried dba_datapump_jobs and found an old job in the "NOT RUNNING" state:
SQL> select * from dba_datapump_jobs;
One job was not running; it needed to be deleted in order to move ahead.
Checked to find the underlying object:
SQL> SELECT a.object_id, a.object_type, a.owner, a.object_name, a.status
     FROM dba_objects a, dba_datapump_jobs j
     WHERE a.owner = j.owner_name
     AND a.object_name = j.job_name
     AND j.job_name = 'SYS_IMPORT_FULL_01';
(SYS_IMPORT_FULL_01 is the job name found by the first query.)

SQL> select owner, object_name, subobject_name, object_type, last_ddl_time
     from dba_objects where object_id = 984119;
(984119 is the object ID found by the query above.)
Dropped the Data Pump job table:
SQL> drop table SYSTEM.SYS_IMPORT_FULL_01;
Table dropped.
And then my impdp job started progressing:
NOTE: Please make sure that the old job is NOT RUNNING before dropping its table. Secondly, if it is a known job, you can also try to attach to it:
impdp username/password attach=JOB_NAME
and then issue KILL_JOB at the impdp prompt. This should also clear the job's status from dba_datapump_jobs.
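A sketch of that attach-and-kill interaction, using the job name found above (the exact confirmation prompt may vary by version):
impdp dba/pwd@target attach=SYS_IMPORT_FULL_01
Import> kill_job
Are you sure you wish to stop this job ([yes]/no): yes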


Useful links
http://docs.oracle.com/cd/B19306_01/server.102/b14215/dp_api.htm
http://peerdba.wordpress.com/2011/01/13/killing-expdpimpdp-job/

