
Streams Downstream Capture Configuration

Oracle Tips by Burleson 

Downstream Capture

At times, it may not be feasible or desirable to configure a database link between the source and downstream databases, for example because:

  • The downstream database is outside a firewall

  • The DBA does not want an inbound database link to the source database

If no database link exists from the downstream database to the source database, the following tasks must be completed manually at the source site when configuring downstream capture:

  • Perform a data dictionary build and obtain the first_scn value

  • Instantiate the replicated database objects

Steps for Implicit Assignment of Logs

Implicit assignment means that the downstream Capture process automatically scans all redo log files added by log transport services from the source database to the downstream database. This method follows the steps shown below:

Step 1: Set Up the Source Database Environment

At the source database, ensure that the init.ora parameters required to configure log transport services are present, so that the redo log files are copied from the source database to the downstream database. For example, the log_archive_dest_2 parameter sets up the log transport service at the source database, DNYTST10.

LOG_ARCHIVE_DEST_2='SERVICE=<downstream_service> ARCH OPTIONAL NOREGISTER REOPEN=60 TEMPLATE=/db_dumps/oracle/archive/TST1_arc_%t_%s_%r.log'

The NOREGISTER attribute needs to be set to indicate that the location of the archived redo log should not be recorded at the corresponding destination. If this attribute is not specified, the following error may be returned:

ORA-16009: remote archive log destination must be a STANDBY database

The TEMPLATE attribute should be set to the directory specification and format template for archived redo logs at the downstream database. This attribute overrides the standby_archive_dest and log_archive_format initialization parameter settings at the remote destination. When specifying a directory within the TEMPLATE attribute value, it is important to make sure that the directory exists, or the following error may be returned:

ORA-00270: error creating archive log

The format template uses %s, %t, and %r. The %s corresponds to the log sequence number, and %r corresponds to the resetlogs ID, which ensures unique names are constructed for the archived redo logs across multiple incarnations of the database. The %t, which is required for RAC configurations, corresponds to the redo thread number.

Enable the archive destination by using the log_archive_dest_state_2 parameter:

LOG_ARCHIVE_DEST_STATE_2=ENABLE

At both the source database and the downstream database, the remote_archive_enable initialization parameter should be set to TRUE:

REMOTE_ARCHIVE_ENABLE=TRUE


Step 2: Set Up Supplemental Logging at the Source Database

At the source database, specify primary key supplemental logging for the desired table. For example, to log primary key information for the table ALLINSURED1, run the following SQL statement:

ALTER TABLE NY1.allinsured1
  ADD SUPPLEMENTAL LOG DATA (PRIMARY KEY) COLUMNS;

Step 3: Create the Queue and Capture Process

The next step is to create the queue and the Capture process at the downstream database. There are two ways to create the Capture process: one specifies a database link to the source database, and the other does not use a database link.

For example, the following PL/SQL block creates a Capture process where a database link to the source database is specified (DNYTST10, the source database named earlier, is used here as the source database global name):

BEGIN
  DBMS_CAPTURE_ADM.CREATE_CAPTURE(
    queue_name             => 'LN1_queue',
    capture_name           => 'LN1_capture',
    rule_set_name          => NULL,
    start_scn              => NULL,
    source_database        => 'DNYTST10',
    use_database_link      => TRUE,
    first_scn              => NULL,
    logfile_assignment     => 'implicit');
END;
/

In the above example, use_database_link => TRUE.  This implies that the Capture process uses a database link with the same name as the source database global name to perform administrative actions at the source database.

The logfile_assignment parameter is set to IMPLICIT, which specifies that the Capture process accepts new redo log files implicitly from the source database. Therefore, the Capture process scans any new log files copied from DNYTST10.

Step 4: Rules for Capture Process

Create the positive rule set for the Capture process and add a rule to it. Typically, the add_table_rules procedure of the dbms_streams_adm package is used for this process.
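As a sketch of that step, assuming the queue and Capture process names from Step 3 and the NY1.ALLINSURED1 table used earlier, a DML capture rule might be added as follows:

```sql
BEGIN
  DBMS_STREAMS_ADM.ADD_TABLE_RULES(
    table_name       => 'NY1.allinsured1',  -- table prepared in Step 2
    streams_type     => 'capture',
    streams_name     => 'LN1_capture',      -- Capture process from Step 3
    queue_name       => 'LN1_queue',        -- queue from Step 3
    include_dml      => TRUE,
    include_ddl      => FALSE,
    source_database  => 'DNYTST10');        -- source global name (assumed)
END;
/
```

If no positive rule set exists for the Capture process, this procedure creates one and associates it with the process automatically.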

Step 5: Propagation and Apply

Now Propagation, Apply, or both can be configured for the LCRs captured by the LN1_CAPTURE Capture process. The instantiation SCN for the table ALLINSURED1 should be set by running the set_table_instantiation_scn procedure of the dbms_apply_adm package at the destination database.
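A sketch of that call at the destination database, assuming DNYTST10 as the source global name and an illustrative SCN value obtained during instantiation:

```sql
BEGIN
  DBMS_APPLY_ADM.SET_TABLE_INSTANTIATION_SCN(
    source_object_name    => 'NY1.allinsured1',
    source_database_name  => 'DNYTST10',   -- source global name (assumed)
    instantiation_scn     => 155781561);   -- illustrative SCN value
END;
/
```

The Apply process then skips changes for this table committed at or below the instantiation SCN.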

Not Using the Database Link

In Step 3 above, when the Capture process is created with the create_capture procedure, some additional steps are needed if use_database_link is to be set to FALSE. A build of the source database data dictionary must be performed in the redo log. Since there is no database link, the configuration at the downstream database cannot execute database operations at the source database, so manual intervention is required.

For example, use the following PL/SQL block at the source database to perform the data dictionary build:

SET SERVEROUTPUT ON
DECLARE
 scn NUMBER;
BEGIN
 DBMS_CAPTURE_ADM.BUILD(first_scn => scn);
 DBMS_OUTPUT.PUT_LINE('First SCN Value = ' || scn);
END;
/

This procedure displays the valid first_scn value, 155781561, for the Capture process that will be created at the downstream database. Note this SCN value; it will be needed when creating the Capture process at the downstream database.
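If the value was not noted when dbms_capture_adm.build ran, it can usually be recovered at the source database from v$archived_log, since the archived log containing the dictionary build is flagged there; a sketch:

```sql
SELECT name, first_change#
  FROM v$archived_log
 WHERE dictionary_begin = 'YES';
```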

Next, prepare the required table for instantiation. For example, run the following PL/SQL block to prepare the table ALLINSURED1:

BEGIN
  DBMS_CAPTURE_ADM.PREPARE_TABLE_INSTANTIATION(
    table_name => 'NY1.allinsured1');
END;
/

The Capture process with use_database_link set to FALSE can be created as shown below:

BEGIN
  DBMS_CAPTURE_ADM.CREATE_CAPTURE(
    queue_name             => 'ln1_queue',
    capture_name           => 'ln1_capture',
    rule_set_name          => NULL,
    start_scn              => NULL,
    source_database        => 'DNYTST10',  -- source database global name
    use_database_link      => FALSE,
    first_scn              => 155781561,  -- use value from build procedure
    logfile_assignment     => 'implicit');
END;
/

The first_scn for the Capture process has been specified. This value was obtained earlier at the source database. The first_scn is the lowest SCN for which a Capture process can capture changes.

Note: Since a first_scn is specified, the Capture process creates a new LogMiner data dictionary when it is first started, regardless of whether there are existing LogMiner data dictionaries for the same source database.


The above text is an excerpt from:

Oracle Streams
High Speed Replication and Data Sharing

ISBN 0-9745993-5-2

by Madhu Tumma




 Copyright © 1996 -2017 by Burleson. All rights reserved.

Oracle® is the registered trademark of Oracle Corporation. SQL Server® is the registered trademark of Microsoft Corporation. 
Many of the designations used by computer vendors to distinguish their products are claimed as Trademarks