Oracle Database - How can we load a large number of rows into an existing indexed table?


1 - About

If you have to perform a batch or bulk data load into a data warehouse or data mart, you should bypass index maintenance during the load to minimize the generation of redo log. The procedure is:

  1. set the indexes to the UNUSABLE state (they are not dropped, just marked as unusable)
  2. perform the bulk load
  3. re-enable the indexes by rebuilding them
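The steps above can be sketched as follows. This is a minimal sketch with hypothetical names: a target table t loaded from a staging table stage_t, with an index t_idx; adapt the names to your schema. (Since Oracle 10g, skip_unusable_indexes already defaults to TRUE; the ALTER SESSION is shown for explicitness.)

```sql
-- 1. Mark the index unusable so the load does not maintain it
ALTER INDEX t_idx UNUSABLE;

-- Let DML proceed even though an index on the table is unusable
ALTER SESSION SET skip_unusable_indexes = TRUE;

-- 2. Perform the bulk load (direct-path insert via the APPEND hint)
INSERT /*+ APPEND */ INTO t SELECT * FROM stage_t;
COMMIT;

-- 3. Re-enable the index by rebuilding it
ALTER INDEX t_idx REBUILD;
```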

Why not just drop the index?
Because the CREATE INDEX command that recreates it can fail, and if nobody notices, query performance silently degrades.
If instead the command that re-enables (rebuilds) the index fails, users running queries that need that index will get an error message, so the problem is detected immediately.
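As a quick sanity check after the rebuild, you can query the data dictionary to confirm the index came back to the VALID state (shown here for the big_table_idx index used in the example):

```sql
-- VALID means the rebuild succeeded;
-- UNUSABLE means the index must still be rebuilt.
SELECT index_name, status
FROM   user_indexes
WHERE  index_name = 'BIG_TABLE_IDX';
```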


2 - Example

This example continues the article: Oracle Database - Why do you still have a lot of redo and archive? The index side effect.

gerardnico@orcl>ALTER INDEX big_table_idx unusable;
Index altered.
gerardnico@orcl>ALTER session SET skip_unusable_indexes=TRUE;
Session altered.
gerardnico@orcl>INSERT /*+ APPEND */ INTO big_table SELECT * FROM all_objects;
66652 rows created.

Statistics
----------------------------------------------------------
       6308  recursive calls
       1545  db block gets
     187973  consistent gets
          0  physical reads
    7606756  redo size
        893  bytes sent via SQL*Net to client
        966  bytes received via SQL*Net from client
          4  SQL*Net roundtrips to/from client
       1550  sorts (memory)
          0  sorts (disk)
      66652  rows processed
gerardnico@orcl>ALTER INDEX big_table_idx rebuild nologging;
Index altered.
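Note that the rebuild above uses NOLOGGING, so the rebuilt index blocks are not protected by redo: a media recovery past this point cannot reconstruct them, and you should take a backup after the rebuild. You may also want to switch the index back to LOGGING so that future maintenance generates redo again:

```sql
-- Restore normal redo generation for subsequent index maintenance
ALTER INDEX big_table_idx LOGGING;
```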
db/oracle/bulk_load_index.txt · Last modified: 2018/10/10 10:16