NAME

SQL_tableload - copy from a persistent database to an in-memory table

SYNOPSIS

package require Humelib
humelib::SQL_tableload sql_tablename {sql_cols *} {row_condition {}} {SQL_tablename {}} {SQL_cols *} {conn sql}

DESCRIPTION

This command is used to copy table data from a persistent database to an in-memory table. Optionally, a subset (selected columns and/or selected rows) of the data may be loaded. Optionally, the column names or the tablename may differ between the two tables. The column data types may also differ but must be compatible, such as DATE and VARCHAR. The in-memory table should already exist.

The command arguments are:

sql_tablename
The tablename in the persistent database.
sql_cols
A list of column names in the persistent database table. Defaults to all columns.
row_condition
A selection expression for a where clause, such as {username='bill' or username='bob'}, may be provided to specify which rows are copied. The default is to copy all rows.
SQL_tablename
The in-memory tablename. Defaults to the same as the persistent database tablename.
SQL_cols
The column names in the in-memory table, in ordered correspondence to the sql_cols column list. The default is to use the column names given by the value of sql_cols, or, if the value of sql_cols is also *, the column names discovered from the schema of the persistent database table.
conn
The name of the ODBC connection; defaults to sql. The ODBC connection must already exist when the command is used.
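
For example, a call that loads only selected columns of selected rows into a differently named in-memory table might look like the following sketch. The table names, column names, and row condition are illustrative, and the in-memory table emp_cache is assumed to already exist with compatible columns.

# hypothetical: copy the emp_id and emp_name columns of rows where
# dept='sales' from the persistent table employee_master into the
# in-memory table emp_cache, renaming the columns to id and name
set reply [SQL_tableload employee_master {emp_id emp_name} \
 {dept='sales'} emp_cache {id name}]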

An error is returned if a table or the ODBC connection does not exist. The usual return value is a two element list consisting of the number of rows queried from the persistent database and the number of rows successfully inserted into the in-memory table. The counts may differ if the in-memory table already has data with the same key values.
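
For example, assuming the return value has been saved in a variable named reply as shown above, and Tcl 8.5 or later, the two counts can be separated with the lassign command:

# split the two element return value into separate row counts
lassign $reply rows_queried rows_inserted
puts "queried $rows_queried rows, inserted $rows_inserted rows"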

The command logic uses the odbc process_statement command to reduce the peak memory required to copy the table data.

The command is part of the Humelib package and may be imported from the humelib namespace.

EXAMPLE

package require Humelib
namespace import -force humelib::*
odbc connect sql MYDB scott tiger
# create in-memory table
SQL "create table dc_custom (domain varchar(32), name varchar(32), \
 description varchar(200), value varchar(2000), primary key(domain,name))"
# load from persistent database
set reply [SQL_tableload dc_custom]
puts "rows processed, rows inserted=$reply"

AUTHOR

Hume Integration Software, www.hume.com

SEE ALSO

odbc, SQL insert

KEYWORDS

database, datahub, ODBC, SQL