
When kettle data is exported to another database, must the table structure be established first?

Requirements:

1. Have you ever needed to migrate all the tables and data from a MySQL database to Oracle?

2. Are you still rebuilding the same Kettle flow by hand over and over: table input, table output, create table, and worrying about it?

What we want is a general database migration process.

Technical guidance:

Before starting the implementation, a similar example (samples\jobs\process all tables) was found among the samples shipped with Kettle.

With some adaptation of that sample, the goal was achieved.

Analysis of implementation process:

The whole process consists of two jobs and four transformations.

Transformation steps used: Table Input, Select Values, Copy Rows to Result, Get Rows from Result, Set Variables, User Defined Java Class, and Table Output.

1. The main job.

2. Get the names of the source-library tables to be migrated and set them as the result set for the following job.

3. Configure the sub-job to execute once for each incoming row (that is, once per table).

4. The following steps make up the sub-job.

5. Get the table name from the current row and set it as a variable.

6. Read the metadata of the current table and create the corresponding table in the target library (this is the difficult part).

Because only the structure of the table to be extracted is needed, not its data, a "where 1=2" condition is appended to the SQL so the query returns no rows.
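The trick can be illustrated with a plain Java sketch (buildProbeSql is a hypothetical helper for illustration, not part of the Kettle API): since 1=2 is always false, the probe query returns an empty result set, yet the database layer still reports the full column metadata of the table.

```java
public class ProbeSql {
    // Hypothetical helper: wrap a table name in a query whose WHERE clause
    // is always false, so executing it yields column metadata but zero rows.
    static String buildProbeSql(String tableName) {
        return "SELECT * FROM " + tableName + " WHERE 1=2";
    }

    public static void main(String[] args) {
        // For each source table, the migration flow would issue a probe like:
        System.out.println(buildProbeSql("employees"));
    }
}
```

The row metadata captured from such a probe is what the step below feeds into the DDL generation for the target library.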

The following code is used to create the target library table.

Java code

public boolean processRow(StepMetaInterface smi, StepDataInterface sdi) throws KettleException
{
    // First, get a row from the default input hop.
    Object[] r = getRow();

    org.pentaho.di.core.database.DatabaseMeta dbmeta = null;
    // In 3.x, use getDatabases() to get all database connections of the repository.
    java.util.List list = getTrans().getRepository().readDatabases();
    if (list != null && !list.isEmpty())
    {
        for (int i = 0; i < list.size(); i++)
        {
            dbmeta = (org.pentaho.di.core.database.DatabaseMeta) list.get(i);
            // The following is the database connection of the target library; modify as required.
            if ("mysql_test".equalsIgnoreCase(dbmeta.getName()))
            {
                break;
            }
        }
    }
    if (dbmeta != null)
    {
        org.pentaho.di.core.database.Database db = new org.pentaho.di.core.database.Database(dbmeta);
        try
        {
            db.connect();
            String tablename = getVariable("TABLENAME");
            logBasic("Start creating table: " + tablename);
            if (tablename != null && tablename.trim().length() > 0)
            {
                String sql = db.getDDL(tablename, data.inputRowMeta); // ${TABLENAME}
                db.execStatement(sql.replace(";", ""));
                logBasic(sql);
            }
        }
        catch (Exception e)
        {
            logError("Exception occurred while creating table", e);
        }
        finally
        {
            db.disconnect();
        }
    }
    return false;
}
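One detail worth noting in the step above: sql.replace(";", "") removes every semicolon in the generated DDL, not just the trailing one, which could corrupt a statement that legitimately contains a semicolon (for example inside a string default). A safer sketch (stripTrailingSemicolon is a hypothetical helper, not Kettle API) removes only the final semicolon, which Oracle's statement execution typically rejects:

```java
public class DdlCleanup {
    // Hypothetical helper: strip only a trailing semicolon from a DDL
    // statement, leaving any semicolons inside the statement text intact.
    static String stripTrailingSemicolon(String sql) {
        String s = sql.trim();
        if (s.endsWith(";")) {
            s = s.substring(0, s.length() - 1);
        }
        return s;
    }

    public static void main(String[] args) {
        System.out.println(stripTrailingSemicolon("CREATE TABLE t (c VARCHAR2(10));"));
    }
}
```

In the step above, db.execStatement(stripTrailingSemicolon(sql)) would then be used in place of the blanket replace.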

7. Table data migration.

8. That is about it. Testing MySQL-to-MySQL and MySQL-to-Oracle migration worked without problems. However, during testing I found that if a source table contains a BLOB column there will be a problem, probably because no fields are specified in the Table Output step. I have not worked out a solution yet and will improve it when I have time.

The whole process above was built under Kettle 4.3, and the complete example can be downloaded from the attachment at the following address:

/blog/1735434