A faster way to update a billion rows.

When you are asked to convert data using Oracle built-in functions such as TRIM, TO_DATE, TO_NUMBER, and so on, across columns of a big table on a production system, the better path to follow is a CTAS (CREATE TABLE AS SELECT).
On Friday, a developer sent me an e-mail asking me to apply TRIM to every column of a big table. And, of course, one with "huge" indexes.

The worst thing I could have done: a simple UPDATE with SET column = TRIM(column).

SQL> update tab1
     set col1 = trim(col1)
       , col2 = trim(col2)
       , col3 = trim(col3)
       , col4 = trim(col4)
     /

As expected, it spent a lot of time and put a huge load on the undo tablespace (UNDOTBS).

Then I remembered an article by Mr. Thomas Kyte in which he advocates the CTAS approach, and I thought: "Why not try it?"
In a production environment, less downtime is better.

And I did:

SQL> rename tab1 to table_original;

SQL> create table tab1
     tablespace users
     as
     select trim(col1) col1
          , trim(col2) col2
          , trim(col3) col3
          , trim(col4) col4
     from table_original;
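When the maintenance window is tight, a CTAS like this is commonly combined with NOLOGGING and PARALLEL to reduce redo generation and elapsed time. This variant is not from the original scenario; it is a sketch of that common optimization (the degree 4 is an arbitrary example), and a NOLOGGING table should be backed up afterwards since it is not recoverable from the redo stream:

```sql
SQL> create table tab1
     tablespace users
     nologging            -- minimal redo for the bulk load
     parallel 4           -- example degree; size it to your hardware
     as
     select /*+ parallel(t, 4) */
            trim(col1) col1
          , trim(col2) col2
          , trim(col3) col3
          , trim(col4) col4
     from table_original t;
```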

Now the data in tab1 has no extra spaces. We must still generate the scripts for indexes, constraints, comments, grants, etc.; this can be achieved with the DBMS_METADATA.GET_DDL function, so we can drop those objects from table_original and recreate them on the new table called tab1, which holds the updated data.

SQL> select dbms_metadata.get_ddl('INDEX', index_name, owner)
     from dba_indexes
     where owner = 'MY_SCHEMA'
       and table_name = 'TABLE_ORIGINAL'
     /
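The same package can extract the DDL for the dependent objects mentioned above. A sketch, assuming the same MY_SCHEMA owner, using DBMS_METADATA.GET_DEPENDENT_DDL (note that it raises an error if no object of the requested type exists on the table):

```sql
-- Constraints defined on the original table
SQL> select dbms_metadata.get_dependent_ddl('CONSTRAINT', 'TABLE_ORIGINAL', 'MY_SCHEMA')
     from dual;

-- Grants given on the original table
SQL> select dbms_metadata.get_dependent_ddl('OBJECT_GRANT', 'TABLE_ORIGINAL', 'MY_SCHEMA')
     from dual;
```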

P.S.: Remember! Don't apply TRIM to NUMBER or DATE columns in the CTAS statement, or those columns will be created as VARCHAR2 in the new table.
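To avoid that pitfall, select the non-character columns untouched so they keep their original datatypes. A minimal sketch (the column names col_amount and col_created are hypothetical, standing in for NUMBER and DATE columns):

```sql
SQL> create table tab1
     tablespace users
     as
     select trim(col1) col1   -- VARCHAR2: safe to trim
          , col_amount        -- NUMBER: selected as-is, datatype preserved
          , col_created       -- DATE: selected as-is, datatype preserved
     from table_original;
```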

Original article at http://asktom.oracle.com

http://asktom.oracle.com/pls/asktom/f?p=100:11:0::NO::P11_QUESTION_ID:6407993912330
