The beauty of Oracle Streams

I’ve said it many times and I will keep repeating it: Oracle Streams is a wonderful feature. Consider the following case I encountered recently:

A large (250+ GB) and very critical database on a single-node, non-RAC server needs to be migrated to a 10.2 RAC system on ASM with minimal downtime. How would you do this?

Option 1: export/import? Too slow. The database consists of roughly 80% CLOB data, which would take ages to import.

Option 2: upgrade using DBUA or catpatch? Faster, but:

  • Requires time to transfer the datafiles to the new system
  • Requires RMAN intervention to get the files into ASM, plus a lot of testing
  • It turns out that the database has nearly all options installed (OLAP, Data Mining, Spatial), while on the new server we don’t want them, because we don’t pay for them. This would mean dropping these options before starting the migration, with no chance to test the procedure in advance. And when doing a highly critical migration, I want everything tested carefully.

Option 3: Data Pump? Over a database link? Faster than exp/imp, but it might still take several hours, especially since the old and new servers are in different data centers with a not-so-fast connection in between (I don’t know the exact speed).
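For reference, a network-mode Data Pump import (no intermediate dump files) can also be driven from PL/SQL. A minimal sketch, assuming a database link named OLD_DB and an application schema named APPDATA (both hypothetical names):

```sql
-- Sketch only: pull the APPDATA schema straight over the OLD_DB link,
-- without writing a dump file on either side.
DECLARE
  h     NUMBER;
  state VARCHAR2(30);
BEGIN
  h := DBMS_DATAPUMP.OPEN(
         operation   => 'IMPORT',
         job_mode    => 'SCHEMA',
         remote_link => 'OLD_DB');                    -- network mode
  DBMS_DATAPUMP.METADATA_FILTER(h, 'SCHEMA_EXPR', 'IN (''APPDATA'')');
  DBMS_DATAPUMP.START_JOB(h);
  DBMS_DATAPUMP.WAIT_FOR_JOB(h, state);               -- blocks until done
  DBMS_DATAPUMP.DETACH(h);
END;
/
```

The same thing can of course be done with the impdp command-line client and its NETWORK_LINK parameter.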

Option 4: a 10.2 logical standby for the primary? This would allow a quick switchover at the moment of migration, but it gives the same problem as the catpatch option: how do you get rid of the installed options in a safe way when you use 10.2 software in which these options are not installed?

Option 5: Streams? Of course… Look at the benefits:

  • It allows replicating only the schema(s) containing application data, so the problem of the Spatial, Data Mining and OLAP options is solved
  • Streams can work between 10.1 and 10.2 databases
  • We start with an empty database created by DBCA in ASM, so there is no RMAN work to copy files from the file system into ASM
  • It can be combined with Data Pump over a database link, so no intermediate temp space is required during the migration. Note that this step is done in advance, so the time needed to copy the data does not matter; Streams will easily catch up afterwards.
  • Just like a logical standby, the new database is open and readable before the migration, so you can continuously check that all changes are received correctly before doing the final switch, and you have the guarantee in advance that the new database is valid and correct.
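The schema-level replication from the first bullet is configured with DBMS_STREAMS_ADM. A minimal sketch, assuming a Streams administrator strmadmin, a queue streams_queue, an application schema APPDATA and a source database OLDDB (all hypothetical names):

```sql
BEGIN
  -- Capture schema-level DML and DDL changes for the application schema
  DBMS_STREAMS_ADM.ADD_SCHEMA_RULES(
    schema_name     => 'APPDATA',
    streams_type    => 'capture',
    streams_name    => 'capture_appdata',
    queue_name      => 'strmadmin.streams_queue',
    include_dml     => TRUE,
    include_ddl     => TRUE,
    source_database => 'OLDDB');

  -- Apply those changes locally on the new 10.2 database
  DBMS_STREAMS_ADM.ADD_SCHEMA_RULES(
    schema_name     => 'APPDATA',
    streams_type    => 'apply',
    streams_name    => 'apply_appdata',
    queue_name      => 'strmadmin.streams_queue',
    include_dml     => TRUE,
    include_ddl     => TRUE,
    source_database => 'OLDDB');
END;
/
```

Only the named schemas are replicated; everything else in the old database, including the unwanted options, is simply left behind.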

However, I had one concern. Adding a capture process to the old database is an important change to make on a highly critical system. What if the capture process caused too much overhead? Could it cause a process or even instance failure on the old database? Is it safe to add it to a highly critical database?

Well, there is a great solution: downstream capture. The architecture is as follows: the current production database sends its archivelogs to the new 10.2 database. This requires only adding an extra, optional log_archive_dest_n, which is a safe change. The new database receives the archivelogs and runs the capture process, which enqueues the logical changes (i.e., all the SQL done on the application tables) on a queue in the 10.2 database. (Yes, a 10.2 capture process can read 10.1 archivelogs.) A local apply process reads these changes from the queue and applies them in the 10.2 database. In this way, the new database is kept in sync with the old one without any overhead on the old, i.e., a very safe operation in a critical environment. The final switch consists of stopping all activity on the old database, switching the logfile and verifying that the last one has been applied; within seconds, all is done.
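A sketch of the two halves of this setup, under assumed names (NEWDB as the TNS alias of the new database, OLDDB as the source database name, and strmadmin as the Streams administrator); the exact destination attributes depend on your configuration:

```sql
-- On the old 10.1 production: ship archivelogs to the new database.
-- An extra, optional destination, so production never waits on it.
ALTER SYSTEM SET log_archive_dest_2 =
  'SERVICE=NEWDB ARCH OPTIONAL NOREGISTER' SCOPE=BOTH;
ALTER SYSTEM SET log_archive_dest_state_2 = enable;

-- On the new 10.2 database: create the downstream capture process,
-- which scans the received 10.1 archivelogs.
BEGIN
  DBMS_CAPTURE_ADM.CREATE_CAPTURE(
    queue_name         => 'strmadmin.streams_queue',
    capture_name       => 'downstream_capture',
    source_database    => 'OLDDB',
    use_database_link  => TRUE,        -- fetch the dictionary from the source
    logfile_assignment => 'implicit'); -- pick up shipped logs automatically
END;
/
```

All the mining work runs on the new server; the only thing the old server does is archive to one more destination.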

I have to admit that there is a minimal impact on the primary. For Streams to work best, it is recommended to have primary or unique keys on all tables (which we have; our developers are really good). Additionally, supplemental logging must be enabled. This causes the primary key columns to be written to the redo log for every update, even if only non-PK columns are updated. However, my experience tells me that this is also a very safe change with minimal overhead (at least if primary or unique keys exist; otherwise all columns will be written to the redo log).
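Enabling it at the database level is a one-liner, and can be verified immediately:

```sql
-- Log the primary key (and unique index) columns with every update
ALTER DATABASE ADD SUPPLEMENTAL LOG DATA (PRIMARY KEY, UNIQUE) COLUMNS;

-- Verify: both flags should now show YES
SELECT supplemental_log_data_pk, supplemental_log_data_ui
FROM   v$database;
```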

So to conclude: Oracle Streams with downstream capture allows you to prepare the migration completely, guarantees that the new database is correct before the final migration starts, requires no risky changes to the old database, poses no issues getting files from the file system into ASM, needs no intermediate temp space, and allows the final switch to be done in a few seconds. That’s what I call the beauty of Streams.
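The final switch itself can be sketched as follows (the views and columns are standard 10g dictionary views; the process names depend on your setup):

```sql
-- On the old production database, after stopping all activity:
ALTER SYSTEM ARCHIVE LOG CURRENT;  -- force the last redo to be archived and shipped

-- On the new database: confirm that apply has caught up with capture.
-- When applied_scn reaches captured_scn, nothing is left in flight.
SELECT capture_name, captured_scn, applied_scn FROM dba_capture;
SELECT apply_name, applied_message_number     FROM dba_apply_progress;
```

Once the two SCNs match, the application can be pointed at the new RAC database.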


6 Responses to The beauty of Oracle Streams

  1. […] The Beauty of Oracle Streams Geert Depaep a colleague and friend at Uptime has a very interesting article about using Oracle Streams for migrating your live database to a higher or even lower release or even across platforms. Read for yourself at his blog : […]

  2. […] Recently I described the use of Oracle Streams to do an Oracle upgrade. It allows you to create a new empty database of a higher version, and then use […]

  3. Ravi says:

    I am trying to find ways to speed up my Streams database. It is painfully slow. Any suggestions would be greatly appreciated.


  4. Bob says:

    Regarding slowness in the post by Ravi…

    Have you figured out a solution? Was it caused by using “downstream capture” where the logs have to be archived before the data can be replicated? I am just setting up a system, will not use downstream, but am still concerned about the latency of the data.

  5. I follow your blog for quite a long time and must tell you that your articles are always valuable to readers.

  6. Damir Vadas says:

    Where can I find something more about this? Looks promising…

