The Hume Integration Datahub

This document is best viewed from the Hume Datahub SDK documentation, which provides a hyperlinked table of contents in the frame immediately to the left of the larger frame where this document is displayed.

(C)Copyright 1996, 2008 Hume Integration Software
All Rights Reserved
This document may not be reproduced or redistributed without prior written permission of Hume Integration Software. Licensed users of HIS provided software have permission to reproduce or electronically distribute this document to support their usage of the HIS software.


The Hume Datahub is an in-memory, low-latency database, a subscription server, a configurable Tcl/Tk interpreter, and a Distributed Message Hub (DMH) server. The synergistic combination of these capabilities is the cornerstone of a new architecture for distributed applications.

As a database server, the Hub provides the familiar programming model of relational database tables using a subset of Structured Query Language (SQL). A graphical user interface is available to display and manipulate table data either remotely as a client, or as part of the Hub process when it is not running in the background.

As a subscription server, the Hume Datahub provides asynchronous notification to Client processes whenever data that meets their selection criteria is inserted, updated, or deleted. These notifications can be standard SQL messages which are useful for data replication to other Hume Datahubs, or for data replication to commercially available persistent databases, such as Oracle or DB2. Other subscription options enable the developer to execute user defined Tcl procedures within the Hub process, or to obtain notification messages in a format that is designed for use by Tcl list manipulation commands.

When used as a DMH message server, a Hume Datahub becomes the hub of an efficient, event-driven, distributed application. The Tcl interpreter provides a high-level, dynamically customizable programming environment with comprehensive features. Client processes can easily exchange peer messages, can easily share common data in relational tables, and can subscribe and respond in realtime to changes in the application data. Shared application logic can be executed within the Hume Datahub process, sharing the same address space with application data tables and SQL command processing logic. The requirement for interprocess communication and the substantial overhead that it entails is reduced drastically compared to other approaches.

Usage Concepts

The capabilities of the Hume Datahub are built into a general purpose Tcl/Tk interpreter, where they are accessed along with the multitude of other Tcl/Tk commands. From within Tcl code, a developer uses the SQL Tcl command with an SQL statement as a single argument to access the data management functionality.

The Tcl programming language has features such as the namespace command to support combining new commands like the Hume provided SQL command with commands from other packages. When you use the Hume Datahub features, typically the command package require dmh and the command dmh_import have been executed during initialization to make the dmh package commands such as SQL available in the global namespace of the interpreter. If the dmh package commands have not been imported into the global namespace, then the command names must be invoked using the namespace prefix, for example, ::dmh::SQL would be used instead of SQL.
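For example, a minimal initialization sketch (this assumes the Hume dmh package is installed and that an equipment table exists, as in the examples later in this document):

```tcl
# Hedged sketch of typical initialization; requires the Hume dmh package.
package require dmh
dmh_import        ;# import dmh commands such as SQL into the global namespace

# With the import in place, the SQL command is used directly:
SQL "select * from equipment where machine='hx14'"

# Without the import, the fully qualified name is used instead:
::dmh::SQL "select * from equipment where machine='hx14'"
```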

In normal use, a Datahub is also participating in a DMH messaging group. In this usage, the Datahub receives DMH messages sent to its command mailbox, and executes the messages as SQL statements. Usually these command messages are standard SQL statements such as the SQL insert statement or the SQL select statement. For convenience to its message system clients, the Datahub accepts and processes certain statements that are not standard SQL, which allow the clients to access the Tcl commands and features of the Datahub. For example, a client can use eval to execute Tcl code such as procedure evaluations.

The Graphical User Interface (GUI) of the Datahub provides features to display and manipulate table data. Also, the GUI provides windows for the input of Tcl commands or SQL commands. The GUI is written in Tcl code, and dynamically loaded into the Tcl/Tk interpreter using the same techniques that you use to create your own applications.


Database Management

The Datahub uses a subset of ANSI standard SQL for database management. The basic SQL commands are supported for operations on one table per SQL statement. Table joining, grant and revoke, commit and rollback, creation of views, and data integrity constraints are not directly supported by the SQL command. However, because the SQL functionality is embedded in a Tcl/Tk interpreter, it is possible to provide equivalent functionality to many of these advanced SQL features by writing Tcl procedures.


SQL Syntax

Table names and column names can have 1 to 31 alphanumeric or underscore characters. They should start with a leading alphabetic character or underscore. You may use either upper or lower case for table and column names and for SQL keywords in your SQL statements. The Datahub retains the original alphabetic case of table and column names from the table create statement, and uses it for query replies.

SQL character strings must be delimited with single quotes, even when they are a single character. To embed single quotes in a character string, double them, as in the example: 'Here''s Johnny'.

The new Datahub is able to convert data types in SQL statements and expressions between the basic integer, floating point, and character types. For example, the string constant '43' is acceptable as the value of an integer or as the floating point value 43.0. Similarly, the numeric value 3.14, if used in a context where a character value is expected, will be treated as the character string '3.14'.

There is sufficient information in the reference section of this document to understand the syntax of SQL statements that are accepted by the Data Hub. If you are new to relational databases, you may wish to obtain additional information on table design and data normalization.

Database tables are created using the SQL create command, and destroyed using the SQL drop command. Each column of a table is a specific data type such as INTEGER. The supported data types are described with the create command.


Using Key Columns

When a table is created, one or more columns can be specified as primary key columns. When key columns are used, insertion of duplicate rows is not allowed. Said another way, you cannot insert a row into a table if a row already exists in the table with the same values of the key columns. In addition to preventing duplicate rows, keys are used to speed the lookup of rows, when processing the SQL insert, update, delete, or select commands. If all of the values of the key columns are specified, the specific row is located directly.
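For example, here is a sketch using the equipment table from the examples later in this document. The exact create syntax is documented with the SQL create command; the primary key clause shown here is an assumption:

```tcl
# Hedged sketch: machine is the primary key column of the equipment table.
SQL "create table equipment (machine varchar(12), class varchar(8), color varchar(8), primary key(machine))"
SQL "insert into equipment (machine, class, color) values ('hx14', 'hx', 'green')"

# A second insert with the same key value is rejected as a duplicate row:
SQL "insert into equipment (machine, class, color) values ('hx14', 'hx', 'blue')"
```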

Using key columns also causes the select command to return row data in sorted order. In contrast to ANSI standard SQL, the sorted result is obtained by default, without using an ORDER BY clause in the SQL select command. If the keys are VARCHAR data types, the sorted order will be a proper ascending sequence of the primary key columns. If numeric data types are used as keys, the ordering is not as useful, and you may want to use the Tcl lsort command on the selected rows to obtain a properly sorted result.
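Re-sorting with lsort can be sketched as follows, assuming each selected row is a Tcl list whose first element is an integer key value:

```tcl
# Rows keyed on an integer arrive in string order (1, 10, 2, ...);
# lsort with -integer and -index restores proper numeric order.
set rows {{10 pumpA} {2 pumpB} {1 pumpC}}
set sorted [lsort -integer -index 0 $rows]
# sorted is: {1 pumpC} {2 pumpB} {10 pumpA}
```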

Data is added to a table one row at a time using the SQL insert command. The SQL select command is used to query existing data. The SQL update command is used to modify stored data, and the SQL delete command is used to delete data rows.


The Where Clause

When using the select, update, and delete commands you optionally provide a WHERE clause which specifies selection criteria for the affected rows. Also, a WHERE clause can be specified when using the SQL open command to establish a subscription only for rows meeting the WHERE clause selection criteria. A WHERE clause consists of one or more comparisons between a column value and a constant, or two column values. All of the usual SQL comparison operators are supported: equal (=), not equal (<>), less than (<), less than or equal (<=), greater than (>), and greater than or equal (>=). Compound conditions are expressed using the NOT, AND, and OR operators. Parentheses may be used to group terms. All of the comparison operators have equal precedence, and their precedence is higher than the NOT, AND, and OR operators. So, if you write

        select * from equipment where machine='hx14' or machine='hx15'

it is equivalent to

        select * from equipment where (machine='hx14') or (machine='hx15')

since the OR operator binds less tightly than the comparison operator =.

NOT has a higher precedence than AND which has a higher precedence than OR. If you write

    select * from equipment where
class='hx' and not color='green' or machine='zq2'

it is equivalent to

    select * from equipment where
((class='hx') and (not (color='green'))) or (machine='zq2')

Your WHERE clauses can also use the LIKE comparison operator to compare a VARCHAR field to a string constant pattern. Standard SQL uses the % character as a wild card that matches 0 or more characters, and the underscore, _, as a wild card that matches 1 character. So if you write

    SQL "select * from equipment where machine like 'hxq%'"

you will select machine values such as 'hxq14'.

The LIKE operator has been implemented using the string match functionality of Tcl. If you look at the reference for the string match command, you will see that it has more features than the simple matching characters of standard SQL LIKE; for example, you can use character classes such as [0-9] to represent any single digit. These powerful features have not been disabled, and you are able to use them in your LIKE operator patterns. However, these features are not standard SQL, and you should consider that they are not portable to other SQL databases. The other implication of not disabling these features is that if your LIKE pattern contains the special characters *, ?, or [ and ], you need to escape them using the backslash character to have them treated as regular characters. You may also precede the SQL wildcard characters % and _ with the backslash character in order to force them to be interpreted as ordinary characters.
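For example (the Tcl-specific patterns are quoted with braces so Tcl does not attempt command substitution on the brackets; these patterns are not portable SQL):

```tcl
# Standard SQL wild cards:
SQL "select * from equipment where machine like 'hxq%'"    ;# hxq, hxq1, hxq14, ...
SQL "select * from equipment where machine like 'hxq_'"    ;# hxq1, hxqA, ...

# Tcl string match extension (not portable to other SQL databases):
SQL {select * from equipment where machine like 'hxq[0-9]'} ;# hxq followed by one digit

# Backslash escapes force literal treatment of special characters:
SQL {select * from equipment where machine like '100\% done'} ;# a literal % character
```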



Table Data Subscriptions

Table data subscriptions are opened to have notification messages sent any time specified table data is changed, or, to have a Tcl procedure executed any time specified data is changed. There can be any number of subscriptions on each table. For example, you may wish to have each user interface open a subscription to a table that holds active alarm data so that a message is sent to every user interface when a new row representing a new alarm is inserted.

Another example use of a table data subscription is to perform realtime analysis on new measurement data using a Tcl procedure that executes in the Datahub. The analysis code is activated without the overhead of interprocess communication. The logic that produces the measurement data, whether it be an equipment interface, or user interface, is developed, tested, and deployed independent of the analysis logic.

The SQL open command is used to create a table data subscription. Each use of the open command creates a subscription to one or more columns of a specific table for all rows which match specified selection criteria. Options to the open command indicate whether the subscription notifications execute a Tcl procedure, or whether the notifications result in messages being sent. There is also a SYNC option that causes notifications to be activated for all of the existing data matching the selection criteria at the time the subscription is opened. The SYNC feature makes it easy for a client to synchronize with the current and future states of the application data. The ease of meeting this application requirement stands in marked contrast to other CIM products such as CELLWORKS dmbx, DEC Message Queue, or TIBCO Rendezvous, which offer broadcast or subscription based message propagation without any history.

The SQL close command is used to remove table data subscriptions.


DMH Message System Integration

The Datahub has a small set of data items that are assigned values using the SQL set command or queried using the SQL query command. These data parameters are used to configure specific aspects of the Datahub's behavior and are described under the set command.

The GROUP parameter specifies the DMH message group name that the Datahub will use when initializing DMH message communication. The value defaults to "mbx". The SERVER parameter specifies the mailbox name that the Datahub will use to receive SQL message commands from the DMH messaging system. The value of the SERVER parameter defaults to "DATAHUB". The SQL EOF command is used to initialize message system communication. Once message communication has been initialized, the values of the SERVER and GROUP parameters cannot be changed. It is typical to specify these parameters, and to initialize DMH message communication on the Datahub's command line. A typical startup might also use the SQL source command to read and process some Tcl initialization code, such as statements to build tables and load data. Thus, an invocation of a Datahub might look like:

datahub "set server=DBM" "set group=spc2" "source hub_init.tcl" eof &

Command Line Usage

The capabilities of the Datahub are built into a general purpose Tcl/Tk interpreter, dmh_wish, which has its own document describing its command line options. These command line options are also available to the Datahub.

Typically, the Datahub is invoked from a script file; namely datahub85, hub85, or datahub85.bat. On your system, you may find these scripts also installed under the version independent names datahub or datahub.bat. The script file configures the underlying interpreter so that the command line arguments which are not processed as interpreter options are processed as SQL commands, as described in the SQL Command Reference section of this document. One of the important dmh_wish command line options is -notk. This option starts the Datahub in the background, without communicating with the windowing system. This is how you want to run a production application, so that you can log out without shutting down the Datahub. Also, when running in production on Unix, you should start the Datahub from a shell that provides protection against the SIGHUP signal at logout, such as ksh, csh, or bash. If you use the older Bourne shell, you may need to use the nohup command.

You typically do not use the -notk option for development work. In this usage, the datahub script file creates the user interface to the Datahub, which is invaluable to a developer.

Whether you run the Datahub in background or not, you can also run the hubclient application on any X-display or any Windows NT display in your TCP/IP network, in order to provide the same Datahub user interface. You are able to run as many instances of this application as you choose.

The Datahub user interface is a tool for developers and not for application users, because it makes it easy to drop tables, delete rows, terminate client applications, or make other potentially disruptive changes.

The hubclient can be started with an argument that specifies the SERVER mailbox and the DMH GROUP of the target Datahub. For example, to provide a user interface to a Datahub running on host malibu, with the SERVER mailbox of DBM, and the DMH GROUP of sta1, enter

hubclient DBM@malibu:sta1 &
at the command prompt. Windows NT users do not enter the background operator &. If you do not supply a command line argument, the hubclient application will prompt you for one.

The hub85 script configures the Datahub so that it does not become a DMH message system server. You use the hub85 script when you wish to run additional instances of Datahubs in a DMH message group, and another process in the group is already functioning as the DMH server. In other words, you should select a single Datahub to be the DMH server for a given group, and invoke it with the datahub script. Subsequent Datahubs that will join the same DMH group should be started using hub85. See the mbx command for more information on message groups.


Datahub Tcl Commands

The following Tcl commands have been implemented as part of the Hume dmh package in order to provide Hume Datahub functionality. Almost always, these commands have been imported into the global namespace of the Tcl interpreter from the ::dmh namespace. The importing occurs when the user runs the datahub85 or hub85 application scripts. Custom applications that do not use these scripts can have the same convenient global access to the commands by calling dmh_import or by directly using the namespace import command. The package commands can be accessed in their native namespace by prepending ::dmh:: to the command name. For example, the SQL command is always accessible as ::dmh::SQL.

SQL execute gs_execute hub_fieldtype
hub_html_save hub_schema_html_data hub_schema_select hub_select_html_data
hub_select_to_csv hub_select_to_html hub_select_to_xml hub_subs_close_all
hub_table_csv hub_table_exists hub_table_list hub_table_refresh
hub_table_schema_list hub_table_sub_list hub_tables hub_tables_load_all
hub_tables_save_all hub_templates hub_xml_sanitize


SQL - Execute an SQL Statement

Tcl Command Syntax
SQL sql_command
The SQL command processes the sql_command argument as described in the next section SQL Command Reference. This command is the workhorse Tcl command that you use in your custom Tcl procedures in order to access database functionality.


execute - Execute an SQL Statement

Tcl Command Syntax
execute sql_command
The execute command exists for compatibility with Tcl code that was written for earlier versions of the Datahub. New Tcl code should use the SQL command to access database functionality of the Datahub.

The Tcl execute command is nearly synonymous with the Tcl SQL command. The command processes the sql_command argument as a Data Hub SQL command as described in the next section. The return value is a Tcl list where each element of the list is a reply message that would be obtained if the argument were sent as a message command to the Data Hub's command mailbox. In other words, since most of the Data Hub SQL commands return a single reply message, most of the return values will be formatted similar to "{//c 0 COMMAND: success}".

The execute Tcl command differs from the SQL Tcl command in the format of the reply for the select and the telect SQL statements. For both these statements, the SQL command reply uses the Tcl Reply Format so that the query data is easily manipulated using Tcl list commands. In contrast, the execute select statement reply is a list of "SELECT: ..." messages with a final "//c" legacy style message. See the select description for details.

The difference between the execute and SQL replies for the "telect" statement is subtle. The execute command returns the Tcl Select Reply Format, but packaged as a one-element Tcl list. Therefore, it is always true that the return value of the execute command is a Tcl list where each element of the list is a reply message that would be obtained if the argument were sent as a message command to the Datahub's command mailbox.
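The distinction can be sketched as follows; the reply text shown in the comments is illustrative of the general "//c rc COMMAND: text" form, not the exact wording:

```tcl
# Hedged sketch; exact reply text varies by statement.
set r1 [SQL "delete from equipment where machine='hx99'"]
# r1 is a single legacy-style reply such as: //c 0 DELETE: ...

set r2 [execute "delete from equipment where machine='hx99'"]
# r2 is a Tcl list of such replies, e.g.: {//c 0 DELETE: ...}
```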


gs_execute - Process an SQL Message

Tcl Command Syntax
gs_execute mbxdest mbxmsg mbxreply

This Tcl command is used internally by the Datahub software when it receives SQL command messages from its command mailbox. The command causes the mbxmsg argument to be processed as an SQL command, and causes reply messages to be sent to mailbox mbxreply when appropriate. When message communication is initialized with the SQL EOF command, Tcl code similar to:

mbx whenmsg $SQL(SERVER) { mbx whenmsg again ; 
gs_execute $mbxdest $mbxmsg $mbxreply }
is executed to establish the basic processing of command messages.

Usually, the application developer need not be concerned about the gs_execute command. However, there are three circumstances that we can think of where you may want to use this command.

First, you can use this command in order to process SQL commands that are received in a different mailbox than the main server command mailbox. Just execute your own "mbx whenmsg" statement specifying the mailbox name that you wish.

A second use for the gs_execute command is to modify the basic DMH command processing loop of the Datahub. For example, in a debugging situation, you could print every statement being processed by executing:

mbx whenmsg $SQL(SERVER) { mbx whenmsg again ; puts $mbxmsg ;
gs_execute $mbxdest $mbxmsg $mbxreply }

Finally, in rare circumstances you may wish to re-arm the Datahub for re-entrant execution of SQL message commands. Ordinarily, you will not receive another DMH message in your command mailbox if gs_execute has not completed processing a prior message. This is a desirable characteristic of the DMH message system since it saves you from having to worry about re-entrant execution of your application code. However, there are circumstances where re-entrant execution may be desired. For example, a Datahub may receive a message that causes it to query another application for some data. In turn, the other application needs to query the Datahub for related data before sending a reply. To re-arm itself for re-entrant execution, the Datahub needs to execute the full statement

mbx whenmsg $SQL(SERVER) { mbx whenmsg again ;
gs_execute $mbxdest $mbxmsg $mbxreply }
while it is "busy" processing the original message.

This contrived example does require re-entrant execution, but in truth it is a better example of poor design. The application should not require the Datahub to perform a synchronous query of another application in order to return a synchronous reply. The design should be reworked so that the processes pass along enough context information to process the same work asynchronously. The example problem can also be sidestepped by having the Datahub process message commands received in another mailbox, and having the other application use this alternate mailbox for its nested queries.


hub_fieldtype - Return a Column Datatype

Tcl Command Syntax
hub_fieldtype Column Schema_list
Given a Schema_list returned from using hub_table_schema_list and the name of a column Column, this Tcl procedure returns the datatype of the column such as "varchar", "varchar(20)", "int", "float" or "double".
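A usage sketch, assuming an equipment table with a machine column exists:

```tcl
# Look up the datatype of the machine column of the equipment table.
set schema [hub_table_schema_list equipment]
set type   [hub_fieldtype machine $schema]
# type now holds a datatype string such as "varchar(12)"
```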


hub_html_save - Write data or schema as HTML documents

Tcl Command Syntax
hub_html_save [one_doc [data [schema [contents [schema_description [table_list [pathname_or_dir]]]]]]]
This procedure writes table data or table schema information from one or more tables to one or more HTML documents. The command is used to implement the HTML Save feature of the Datahub GUI, and it can be a useful procedure for your applications. The one_doc, data, schema, contents, and schema_description arguments are all used as booleans with values 1 or 0 as follows:

one_doc - If 1, output will be to a single file, and the pathname_or_dir argument, if provided, specifies the output pathname. Defaults to 1.
data - If 1, table data will be output. Defaults to 1.
schema - If 1, table schema information will be output. Defaults to 1.
contents - If 1, a Table of Contents with hyperlink references will be written into the output file(s). Defaults to 1.
schema_description - If 1, the schema tables will have an additional column Description which will be empty. This option is used to help prepare documentation. Defaults to 0.
table_list - The list of tables to process. The default value creates output for every hub table.
pathname_or_dir - If this argument is supplied, it is used as the output pathname if one_doc is 1; otherwise it specifies the output directory. The default filename when one_doc is 1 is snapshotYYYYMMDD_HHMMSS.html, where substitutions are made for the current date and time. For output to multiple documents, the filenames are written as tablename.html. The default directory in both cases is the current working directory.


hub_schema_html_data - obtain schema data as HTML

Tcl Command Syntax
hub_schema_html_data tablename [Description [fid]]
This command writes an HTML table describing the schema of tablename. If passed a handle to an open file as fid, the command writes directly to the file; otherwise the HTML is the returned result. The Description argument is used as a boolean to indicate whether an additional column labeled Description should be output. This option may be used to help create documentation.


hub_schema_select - Obtain the schema of a table as a selection result

Tcl Command Syntax
hub_schema_select tablename [Description]
The command provides the schema of a table as if it were a Tcl Reply Format selection result. The result includes spaces and mixed case column names, which you would not get from an ordinary table selection. Set the Description argument to 1 to obtain a blank Description column, which may be desired for creating HTML documentation. The provided columns are Column Name, Key, and Type, and then optionally Description.


hub_select_html_data - Obtain table data as HTML

Tcl Command Syntax
hub_select_html_data table [fid [columns [where_clause]]]
Selects data from a table using the SQL command. Converts the data to HTML. If passed the handle to an open file, writes to the file otherwise returns the HTML data as the result. The default value of the columns argument selects all the columns. You can optionally specify a list of desired columns. The default value of the where_clause argument selects all the rows. When provided, the where_clause should include the where keyword.


hub_select_to_csv - Convert Selected Data to CSV Format

Tcl Command Syntax
hub_select_to_csv selection [write_headers]
This procedure is used to convert a Datahub or ODBC selection result to the popular Comma Separated Value (CSV) format. The procedure return value is a list of text lines without newline delimiters. The optional argument write_headers defaults to 1 which causes a comma separated list of column names to be included as the first list element of the return value.


hub_select_to_html - Convert the Tcl Reply Format to HTML

Tcl Command Syntax
hub_select_to_html selection [empty_string]
Convert a Datahub or ODBC selection result to HTML. The HTML result is also proper XML. This format uses the table rendering tags of HTML and displays properly in a browser. You may want to set the empty_string argument to &nbsp; to have empty table cells displayed as recessed.

Here is an example of the format. The table below:

name     description
3sigma   Western Electric #1 - one point outside of +/- 3 sigma
3sigmaH  One point higher than + 3 sigma
3sigmaL  One point lower than - 3 sigma

is rendered from the HTML data:

<table id="dc_runrules" border="1"><tr> <th>name</th><th>description</th></tr>
<tr><td>3sigma</td><td>Western Electric #1 - one point outside of +/- 3 sigma</td></tr>
<tr><td>3sigmaH</td><td>One point higher than + 3 sigma</td></tr>
<tr><td>3sigmaL</td><td>One point lower than - 3 sigma</td></tr>
</table>


hub_select_to_xml - Convert the Tcl Reply Format to XML

Tcl Command Syntax
hub_select_to_xml reply [header]
Convert a Tcl Reply Format result to XML. The input format is from a selection, or a subscription notification. The tags make no sense to a browser, so this format is intended for XML parsing software. Setting the header argument to 0 suppresses having the string <?xml version="1.0" encoding="UTF-8"?> as the first part of the result.

Here is an example of the XML format. The table data that resulted in the HTML table shown in the previous procedure was also converted to the following XML data:

<?xml version="1.0" encoding="UTF-8"?>
<row><name>3sigma</name><description>Western Electric #1 - one point outside of +/- 3 sigma</description></row>
<row><name>3sigmaH</name><description>One point higher than + 3 sigma</description></row>
<row><name>3sigmaL</name><description>One point lower than - 3 sigma</description></row>


hub_subs_close_all - Close All Subscriptions

Tcl Command Syntax
hub_subs_close_all
Calling this Tcl procedure closes all the subscriptions on all the tables of the Datahub.


hub_table_csv - Write Data in CSV Format

Tcl Command Syntax
hub_table_csv table [pathname [column_list [where_clause]]]
This procedure is used to write selected data of a table to a file in the popular Comma Separated Value (CSV) format. The procedure will overwrite without warning an existing file of the same name. The default pathname is the table name concatenated with ".csv" in the working directory. You may optionally specify selected columns as the column_list argument. The default is to output all columns. You may specify a row selection criteria as the optional where_clause argument. The default behavior is to select all rows. The where_clause argument, if used, should be a complete SQL select where clause, including the literal where keyword.
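For example, to write two columns of the hx-class rows to a chosen file:

```tcl
# Write selected equipment data in CSV format; note that the
# where_clause argument includes the literal where keyword.
hub_table_csv equipment hx_equipment.csv {machine class} "where class='hx'"
```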


hub_table_exists - Check For Table Existence

Tcl Command Syntax
hub_table_exists table
Calling this Tcl procedure returns 1 if the table named table exists or 0 if it does not. The alphabetic case of table does not matter.


hub_table_list - List Table Statistics

Tcl Command Syntax
hub_table_list
Calling this Tcl procedure returns a list of {table row_count subscription_count} records for each table. The field table is the table name, row_count is the current count of data rows, and subscription_count is the current count of subscriptions on the table. Use the hub_tables procedure to obtain a plain list of table names.


hub_table_refresh - Refresh the Datahub GUI Main Window

Tcl Command Syntax
hub_table_refresh
Calling this Tcl procedure updates the Datahub GUI main window. It is an error to call this procedure if the Datahub is executing in the background without a main window.


hub_table_schema_list - Query a Table's Schema

Tcl Command Syntax
hub_table_schema_list table
Calling this Tcl procedure returns a list of {columnname datatype key_number} records for each column in the table. The key_number item is an empty string if the column is not a key.


hub_table_sub_list - List Table Subscriptions

Tcl Command Syntax
hub_table_sub_list table_list
Given a list of one or more tables, this Tcl procedure returns a list of subscription records, where each record consists of {table subscription_name open_statement}.


hub_tables - List Existing Tables

Tcl Command Syntax
hub_tables
This Tcl procedure returns a list of existing tables.


hub_tables_load_all - Read Saved Table Files

Tcl Command Syntax
hub_tables_load_all
This Tcl procedure attempts to load all of the files in the current directory that match the pattern *.tab using the read over command. The assumption is that they store a set of tables that were saved using the write command or the hub_tables_save_all procedure.


hub_tables_save_all - Save All Tables

Tcl Command Syntax
hub_tables_save_all
This Tcl procedure writes all existing tables as .tab files in the current working directory using the write over command.


hub_templates - Provide template SQL statements

Tcl Command Syntax
hub_templates table [format]
This Tcl procedure returns a list of programming statements which are models for custom development. The optional format argument defaults to tcl, which means that the statements are Tcl code. In this case, a list of six elements is returned. The list consists of a create statement, an insert statement, an update statement, a delete statement, template selection code, and a column name assignment statement. The value sql is also supported for the format argument, in which case the template programming statements are SQL. In this case the procedure return value is a list of five SQL statements; the create, insert, update, delete, and select statements. This procedure is used by the Datahub user interface to provide the developer with cut-and-paste statement templates to facilitate software development.
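A usage sketch, assuming an equipment table exists:

```tcl
# Fetch the five SQL statement templates for the equipment table.
set stmts [hub_templates equipment sql]
foreach s $stmts { puts $s }   ;# create, insert, update, delete, and select
```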


hub_xml_sanitize - Replace special XML or HTML characters

Tcl Command Syntax
hub_xml_sanitize data
This command uses regsub to substitute &tag; entities for occurrences of &, <, >, and ". The substitution prepares data for correct inclusion in XML and HTML documents.
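A hedged usage sketch; the variable names and the data value are hypothetical:

```tcl
# Sketch only: prepare a table value for embedding in XML markup
set raw {temp < 20 & rh > 80 "alarm"}
set safe [hub_xml_sanitize $raw]
# $safe now has &, <, >, and " replaced by their &tag; entities,
# so it can be placed inside an XML element or attribute value
puts "<reading note=\"$safe\"/>"
```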


SQL Command Reference

This section presents the commands that are given as arguments to the Tcl SQL command or to the Tcl execute command. These are also the commands that are processed when sent as messages to the command mailbox of the Datahub.

The new Datahub retains reply format compatibility with the SCC Data Server program that was developed by SEMATECH. In general, a reply is formatted as "//c rc COMMAND: text". In this format, a return code, rc, of 0 means that the command completed successfully. A non-zero code implies an error or warning condition.

A blank statement is treated as comment text. Comment commands are exceptional in that reply messages are not sent when the comment commands are received through the message system.

If you execute an SQL statement that is not supported, or execute some other unrecognized statement, the reply is:

//c 2 <UNKNOWN>:  Unrecognized command, enter "help" for command list.

These commands also retain compatibility with the SEMATECH SCC Data Server program in other ways that are not explicitly documented for each command, because they are deprecated for future use. See the section on Compatibility.



# comment
Syntax # comment text
# These statements build version 2 tables
# Send change requests to
Description This command lets you add comments to files of SQL commands that will be processed using the SQL include command. The # command and the rem command are equivalent.
Replies none


CLOSE Close a subscription
Syntax CLOSE SUB subname TO table

SQL "close sub my_sub to equipment"

Description This command removes the subscription named subname from the table named table. Use dump, or the Datahub GUI to display existing subscriptions.
//c 0 CLOSE:  subname - Subscription closed to table table.

//c 5 CLOSE:  table - table not found
//c 5 CLOSE:  Table table record is corrupt
//c 5 CLOSE:  subname - subscription not found in table table.


CREATE Create a new database table
Syntax CREATE TABLE table ( colname datatype [, colname datatype]* [, PRIMARY KEY(colname [, colname]*)] )

datatype := { VARCHAR[(n)] | INT[TEGER] | FLOAT | DOUBLE [PRECISION] }

n := non-zero unsigned integer

SQL "create table spc_type (
typecode varchar,
description varchar(80),
primary key(typecode))"
Description The SQL create statement creates an empty database table.

Each column of the table is a specific datatype:

INT[EGER]
Corresponds to a C language integer, typically a 32-bit signed integer, with the range of -2147483648 to 2147483647.

VARCHAR[(n)]
Character strings are stored as arrays of 8-bit values. With VARCHAR(n), you specify n as the maximum number of characters possible for the field. A VARCHAR declaration without an array size is used to store a single character. The Datahub only uses as much memory as is actually needed to store your data, so there is no added overhead with specifying generous array sizes.

You can store and retrieve all of the usual printable ASCII characters and the extended characters ranging from binary codes 0x80 to 0xff. Also, you can get away with storing and retrieving unprintable characters except for the null byte 0x00 and its successor 0x01, which are used internally by the Datahub software. Although it is less compact, you can always store text representations of binary codes such as the literal sequence \x00.

FLOAT, DOUBLE [PRECISION]
The current Datahub stores these types using IEEE 8 byte double precision floating point variables with a range of absolute values of >= 2.225073858507201e-308 and <= 1.7976931348623158e+308. Most databases store the FLOAT type as 4 byte single precision floating point values with a smaller range; this may be a portability issue. See the discussion of the USE_TCL_PRECISION parameter for information on how to control the number of digits that are displayed when values are queried.

There is no set limit to the number of rows that are stored in a table. If you will be querying tables that have thousands of rows, you should design your application to use primary keys so that performance scales well.

The Datahub itself imposes the limits that the total number of characters in an SQL command must be less than 70,000 and that the number of lexical tokens in an input line must be less than 2500. This imposes a generous limit on the amount of data that can be stored in a single table row. These figures also imply that the maximum number of columns is limited to some value over 1200.

There is no fixed limit on the size of a query result, other than the limit of 70,000 characters on any single row of table data. If your operating system and TCP/IP implementation can handle it, you can successfully manipulate multiple megabyte queries.

Use dump, or the Datahub GUI to display existing tables. Use drop to delete tables.
//c 0 CREATE:  table - Table created.

//c 5 CREATE:  table - table already exists
//c 5 CREATE:  bad data type for column colname
//c 5 CREATE:  incomplete statement
//c 5 CREATE:  bad array size for column colname
//c 5 CREATE:  expected ) to close array size for column colname
//c 5 CREATE:  table - column "colname" already specified.
//c 5 CREATE:  table - missing KEY after PRIMARY
//c 5 CREATE:  table - expected ( after KEY
//c 5 CREATE:  table - expected key column name
//c 5 CREATE:  table - declared key colname unknown
//c 5 CREATE:  table - colname duplicate key
//c 5 CREATE:  extra tokens after closing )
//c 5 CREATE:  improper column name (a keyword perhaps?)


DELETE Delete rows from a table
Syntax DELETE FROM table [WHERE where_criteria]

SQL {delete from captains_log where stardate<'1996-07-05 00:00:00'}

delete from my_table

SQL "delete from my_table where my_col like 'tmp%'"

Description This command deletes rows that are selected by the where_criteria from the table named table. If no WHERE clause is supplied, all of the rows are deleted. See The Where Clause for the grammar of the where_criteria argument.
//c 0 DELETE:  table - 1 row deleted.
//c 0 DELETE:  table - N rows deleted.

//c 3 DELETE:  table - 0 rows deleted.

//c 5 DELETE:  table - table not found
//c 5 DELETE:  Table table record is corrupt
//c 5 DELETE:  syntax error description


DROP Destroy an entire table
Syntax DROP TABLE table

drop table report_temp

SQL "drop table report_temp"

Description This command destroys the named table, freeing the memory resources it was using. Subscriptions on the table are closed as it is dropped.

It is more efficient to empty a table using delete than to drop and re-create it.

//c 0 DROP:  table - Table destroyed.

//c 5 DROP:  table - table not found


DUMP Obtain status information
Syntax DUMP { TABLE | SCHEMA table | SUB table }

dump schema repair_log

Description There are three variations of the DUMP command:
DUMP TABLE
returns a list of all the existing tables, and shows the number of rows and columns of each.
DUMP SCHEMA table
returns a list of the columns of the table named table.
DUMP SUB table
returns a list of the subscriptions on the table named table.
This command is almost never used directly since the same information is presented using the Datahub GUI or the hubclient GUI. For new Tcl code, we highly recommend that you use the hub_* Tcl procedures to obtain the same information in forms that are better suited for program use. See for example, hub_tables, hub_table_list, hub_table_exists, hub_table_schema_list, or hub_fieldtype.
multiple replies of status information followed by
//c 0 DUMP: Complete

//c 5 DUMP:  table - table not found


ECHO send the argument as a reply
Syntax ECHO echo_text

echo hello world

Description This command exists for compatibility with older versions. The echo_text argument is sent back as the reply message.


EOF initialize message system communication
Syntax EOF

datahub "set group=spc2" eof &

Description Executing this command causes the Datahub to initialize message system communication, and thereafter to process messages received in the server mailbox as SQL commands.
//c 0 EOF:  Message system initialized, command mailbox is \


EVAL evaluate Tcl code
Syntax EVAL tclcode

eval user_login jsmith rx7fan

eval exec xterm -display atlas:0.0 &

eval SQL "update checking set balance=12 where acct='smith'" ; SQL "update savings set balance=24 where acct='smith'"

eval proc hello {a} {puts $a}

Description This command provides a means for message system clients to access the Tcl functionality of the Datahub. As the examples show, it can be used to execute Tcl procedures, or even to execute more than 1 SQL command in a single message.
Replies If you are accessing the eval command with the DMH message system the reply message is exactly the result you would see if the same tclcode were executed at a Tcl/Tk shell prompt.

When you access the eval command in Tcl code using the SQL command, the result is always returned as a single list element, which means that braces are added if the result contains internal white space. Here are two examples to demonstrate the difference, which can be subtle.

"eval set x hello"
Reply message is hello
SQL "eval set x hello" result is hello.
"eval set x {hello world}"
Reply message is hello world.
SQL "eval set x {hello world}" result is {hello world}


EXIT shutdown the Datahub
Syntax EXIT


Description This command causes the Datahub to exit. In contrast to older versions of the Datahub, delete and drop messages are not sent to subscribers during the processing of the exit.
//c 0 EXIT:  orderly shutdown initiated


HELP display command usage
Syntax HELP [command_name]


help create

Description This command returns the syntax of an SQL command or of all the SQL commands.
one or more lines of command synopsis, followed by
//c 0 HELP: Complete, '*' indicates disabled.


INCLUDE process a file of SQL commands
Syntax INCLUDE pathname

include c:/users/tonka/xtc.sql

include ../lib/report1.sql

SQL "include ../lib/report1.sql"

Description This reads and processes a file of SQL commands. The logic reads each text line in the file as a new SQL command except if the ending newline occurs inside of an SQL string constant or inside of an unfinished sequence of Tcl code. In the latter cases, additional lines are read until the SQL command is complete.

Relative pathnames are with respect to the current working directory of the Datahub process. The current working directory of the Datahub can be changed using the Tcl cd command, or by setting the CWD parameter.

Even on Windows NT platforms, the pathname is specified using Unix style slashes.

The INCLUDE command may be nested inside of included files to arbitrary depths.

a reply for each included command  
possibly intercalated with:
[Error at line N in file "pathname".]
followed by:
//c 0 INCLUDE: Completed file "pathname" N errors encountered.
//c 0 INCLUDE: Completed file "pathname" 1 error encountered.

//c 1 INCLUDE: Unable to open file "pathname"
//c 2 INCLUDE: Usage - INCLUDE


INSERT add a new row to a table
Syntax INSERT INTO table [(colname [, colname]* )] VALUES (value [, value]*)

insert into spc_type (code,description) values ('X','XBar, S')

insert into system_constants (name, value) values ('USE_BLIVETS', '1')

SQL {insert into spc_type (code,description) values ('R','XBar, R')}

Description This is a standard SQL syntax for adding a data row to a table. Columns that are not specifically named are given the default values of 0 for numeric fields and the empty string, '', for character data.  

Datahub versions after October 2010 allow for the list of column names to be omitted.  If a list of column names is not supplied, then the list of columns from the table create statement is implied, and the number of supplied values must match the number of table columns.  

As a portability issue you should know that some persistent databases do not allow empty strings to be inserted for character fields that are primary key columns.
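The implied-column form described above might be sketched as follows; this assumes the spc_type table from the CREATE example, whose columns are typecode and description:

```tcl
# Sketch only: requires a Datahub version after October 2010.
# With no column list, the columns from the create statement are implied,
# and one value must be supplied per table column.
SQL {insert into spc_type values ('P', 'p-chart')}
# equivalent to the explicit form:
SQL {insert into spc_type (typecode, description) values ('P', 'p-chart')}
```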

//c 0 INSERT:  table - Record inserted.

//c 1 INSERT: table - column "colname" not in table
//c 3 INSERT: Unable to insert. Unique key violation
//c 5 INSERT: table - table not found.
//c 6 INSERT: Table table record is corrupt
//c 7 INSERT: bad value for column colname
//c 9 INSERT: improper value supplied for column colname
//c 10 INSERT: cannot convert string value to number for column \
//c 13 INSERT: memory allocation failed


OPEN create a new subscription on a table

For a message subscription:

OPEN SUB subname TO table [SENDTO=mailbox] [REPLYTO=mailbox] [TELECT [ARGDATA=data]] [INSERT] [UPDATE] [DELETE] [DROP] [SYNC] [CREATE] { * | colname [, colname ]* } [WHERE where_criteria ]

For a procedure subscription:

OPEN SUB subname TO table { PROC=procedure | SQLPROC=procedure } [ARGDATA=data] [INSERT] [UPDATE] [DELETE] [DROP] [SYNC] [CREATE] { * | colname [, colname ]* } [WHERE where_criteria ]
open sub testing to mytable sqlproc=puts *

open sub demo2 to mytable sqlproc=puts col1,col2,col3 where col1>20

open sub ENGDB to shopdata sendto=ENGDB * where engdb_wants=1 and station='test'

proc sub1 {notice_type table cols keys row_ct errtext data} {
    puts "table=$table notice_type=$notice_type"
    foreach row $data {
        vset $row $cols
        foreach col $cols { puts "\t$col=[set $col]" }
    }
}
SQL "open sub sub1 to test_table proc=sub1 insert update\
delete sync *"

proc sub2 {stmt} { puts "SQL statement=$stmt" }
SQL "open sub sub2 to test_table sqlproc=sub2 *"

proc opc_secs_connect_alarm {spname ocname ALID} {
    set subname opc_secs_alm_${spname}_${ALID}
    SQL "close sub $subname to opc_client" ;# in case we get called again
    set argdata [list $spname $ALID]
    SQL "open sub $subname to opc_client proc=opc_secs_alm_sub argdata=[list $argdata] \
        SYNC update is_disconnected where ocname='$ocname'"
}

proc opc_secs_alm_sub {notice_type table cols keys row_ct errtext data argdata} {
    vset [lindex $data 0] $cols ;# now we know ocname and is_disconnected
    vset $argdata {spname ALID}
    # signal an alarm set or alarm clear
    SQL "update ei_alarm set is_set=$is_disconnected where spname='$spname' and ALID='$ALID'"
}


The command opens a subscription to the table named table. You can only open subscriptions on tables that have one or more columns designated as primary key columns.

There are two main variations depending on whether you want a message sent for each subscription notification, or you want to have a Tcl procedure evaluated for each subscription notification.

When opening a message-based subscription, you specify the mailbox to receive the notifications using the SENDTO=mailbox clause. The default value of this optional clause is the reply mailbox of the open command message itself. By default, the message notifications will be SQL statements.  You may specify the TELECT keyword in order to receive messages that are formatted in the Tcl Reply Format style.  With the Tcl Reply Format, you are also able to specify optional data using the clause ARGDATA=data.  If this argument is included after the TELECT argument, then an additional list element is appended to the message data list and the value of this element is the data argument that you have specified.  So you can use the ARGDATA option to pass additional context information.

With a message-based subscription, you may optionally specify the REPLYTO=mailbox clause to indicate a reply mailbox for the notification messages. The default value of the REPLYTO mailbox is NULL, which is a special value that means no reply mailbox is applicable. You can use the reply mailbox as a mechanism to indicate which Datahub or which subscription a subscription message came from as long as your REPLYTO mailbox value is acceptable as a single token mailbox name.

When opening a procedure subscription, the SQLPROC=procedure clause is used to indicate that the Tcl procedure named procedure is to be called with a single argument, the SQL notification statement. The alternative clause, PROC=procedure, is used to specify a Tcl procedure that gets called with the 7 arguments of the Tcl Reply Format.  You are able to specify a data value for an additional argument to your subscription procedure using the ARGDATA=data option.  If you specify an ARGDATA clause, then the SQLPROC procedure is executed with 2 arguments, and the PROC procedure is executed with 8 arguments.  In both cases, the additional argument is the last one.

When using the ARGDATA = data option, the data value is formatted as a single argument using the formatting rules of Tcl arguments.  You may use braces { } or double quotes to delimit an argument containing spaces.  Do not use SQL style single quotes.  For example, ARGDATA = {this is correct}, or ARGDATA = "also correct", or ARGDATA = OneTokenCorrect.

You are able to choose the type of data changes you wish to receive notifications for by optionally specifying one or more of the keywords INSERT, UPDATE, DELETE, or DROP. If you do not specify any of these data change types, then you receive notifications for all of them.

The SYNC option causes your subscription to be notified immediately with insert notifications for each row in the table selected by your specified or implicit where_criteria. This option lets you synchronize with the existing data, and in the same action be registered for future changes. The default behavior is only to notify you of future changes.

The CREATE keyword is used to receive an SQL create table statement as the first SQL notification message or first SQLPROC procedure evaluation. The SQL create table statement is for the entire table, not just the columns that you are subscribing to. Using both the SYNC and CREATE options is a convenient way to get all of the SQL statements needed to replicate a table. For example, you can use the SYNC and CREATE options and specify as the SENDTO destination the mailbox of another datahub, to create a copy of a table that stays in synchronization with changes to the original table.  The CREATE option is only used with SQL formatted message notifications - it cannot be used with the TELECT or PROC choices.
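The replication pattern described above might be sketched as follows; the subscription name, the mailbox name HUB2, and the table name lot_data are hypothetical:

```tcl
# Sketch only: replicate lot_data to a second Datahub whose command
# mailbox is HUB2. CREATE sends the create table statement first, SYNC
# follows with insert notifications for the existing rows, and thereafter
# an SQL statement for every data change is mailed to HUB2.
SQL "open sub replicate_lot to lot_data sendto=HUB2 sync create *"
```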

You can specify a list of specific column names for your subscription, or use * to indicate all columns. Whether you ask for them or not, all of the key columns for the table are included in your column list, and included in all of your notifications. This is done so that your notifications are unambiguous. You specify a column name to indicate you would like to be notified if a value in that column is updated, or you would like to know the new value of that column when a row is inserted. When a row is updated, update notifications do not include the value of every column name in the subscription's list. Instead, the update notifications include values for all of the key columns, and values for the non-key columns requested by the subscription only if they have been changed in the update. A value must actually be changed to appear in an update notification. For example, if an update statement is processed that specifies updating a field to its current value, no notifications are dispatched.

When opening the subscription you optionally supply a WHERE clause to specify row selection criteria for your notifications. The default is that you are notified for changes to any row. See The Where Clause for the grammar of the where_criteria argument.

Here is a fine point on update notifications. The where clause is evaluated on the updated row data, not on the original row data. If your where clause specifies comparisons on non-key column values, (1) you will see update notifications for rows that are updated to match the where criteria from a previous unmatching state, and (2) you will not see update notifications for rows that are updated such that they no longer match your where criteria.

If the value of a primary key column (or columns) is (are) updated, the Datahub processes two notifications; a delete notification for the original value of all the key columns, followed by an insert notification for the row data with new key column values. This behavior guarantees that if a WHERE clause is expressed only on values of key columns, the subscription notifications synchronize perfectly with the selected subset of the table. If a single update notification were sent, (1) a selective subscription would not be notified when a row is updated to no longer match the WHERE criteria, and (2) a selective subscription would receive update notifications for rows that it did not previously know about, as they were updated to match the WHERE criteria.
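As a sketch of the key-update behavior, assuming a hypothetical lot_data table keyed on lot_id:

```tcl
# Sketch only: updating a primary key value dispatches two notifications
SQL "update lot_data set lot_id='9803021B' where lot_id='9803021A'"
# subscribers receive, in order:
#   a delete notification for the original key value '9803021A'
#   an insert notification for the row data with the new key '9803021B'
```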

There can be any number of subscriptions on a table.

The subscription name, subname, must be unique among all of the subscriptions to table table. Subscription names follow the same conventions as table and column names, except that they can be as long as 80 characters in length. You use the subscription name with the close command to close a subscription.

All of the subscriptions on a table are processed by the Datahub before it replies to the original SQL command that caused the data change(s).

There are certain operations that you should not do in a subscription procedure.

  • You should not drop the table, or close the subscription that is being processed. The new Datahub has internal protection to prevent core dumping if this happens but we advise against doing it. If you must do either of these, do it in an after command so it is carried out after the subscription returns.
  • You should not perform lengthy operations, or perform blocking operations, such as waiting for user input.

Perhaps surprisingly, it is safe in your subscription procedure to perform nested data manipulations, such as updating a row that has just been updated, or deleting a row that has been inserted or updated. If you design a complex mess like this, you need to be mindful that you are possibly breaking the design assumptions of the other subscriptions on the table. Also, the subscription notifications may not be in the proper sequence for all of the subscriptions since the nested notifications are completed before the Datahub finishes with the original notifications. A design alternative to nested data manipulations is to do the data manipulation in an after command so it is carried out after the procedure returns.
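The after-command alternative mentioned above might be sketched like this; the procedure and table names are hypothetical:

```tcl
# Sketch only: defer a nested data manipulation until the current
# round of subscription processing has finished
proc my_sub {notice_type table cols keys row_ct errtext data} {
    # ... examine the notification ...
    # schedule the follow-up change instead of updating the table right here
    after 0 [list SQL "update audit_log set touched=1 where table_name='$table'"]
}
```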

//c 0 OPEN:  subname - subscription opened.

//c 1 OPEN: invalid column name
//c 1 OPEN: table - column "colname" not in table.
//c 1 OPEN: SENDTO missing for message subscription
//c 2 OPEN: SENDTO is mutually exclusive with non-message\
subscription types
//c 3 OPEN: REPLYTO is mutually exclusive with non-message\
subscription types
//c 4 OPEN: PROC is mutually exclusive with other\
subscription types
//c 5 OPEN: table - table not found.
//c 5 OPEN: syntax error description
//c 6 OPEN: SQLPROC is mutually exclusive with other\
subscription types
//c 7 OPEN: TELECT is mutually exclusive with non-message\
subscription types
//c 8 OPEN: subname - table table does not contain keys.
//c 8 OPEN: ARGDATA is not supported for SQL message subscriptions
//c 9 OPEN: Insufficient options to determine subscription type
//c 10 OPEN: CREATE option requires SQL notification format
//c 11 OPEN: subname - duplicate subscription name for table table.
//c 13 OPEN: Table table record is corrupt


PING verify online responsiveness
Syntax PING


Description This command exists for compatibility with older versions.
//c 0 PING:  SERVER date time (POSIX_seconds) Id

//c 0 PING: datahub 08/09/96 17:52:00 (839620320) datahub


QUERY display the value of configuration parameters
Syntax QUERY [parameter]

query group

Description The Datahub has a handful of configurable data parameters. Each is described under the SET command. This command lets you obtain the current value(s). With no argument, all of the values are returned, otherwise the value of parameter is returned. For compatibility with SEMATECH style servers, the parameter = value reply is spaced differently depending on whether you requested a specific parameter or not.
one or more:
parameter = value
followed by
//c 0 QUERY: Complete, '*' indicates disabled.

//c 1 QUERY: Variable "name" not found.
//c 2 QUERY: QUERY [variable]


READ create a populated table from a saved file
Syntax READ [OVER] table

read over lotdata

Description This command restores a table schema and its data from a previously saved image. The saved image is created using the write command. The OVER option must be specified in order to replace an existing table using the read command.

The read command expects to find a file of the name table.tab in its current working directory in order to process table table. The current working directory can be changed using the CWD parameter or the Tcl cd command.

Subscription information is not saved and restored using read and write. When you use the OVER option and replace a table, existing subscriptions are all closed.
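A hedged save-and-restore sketch, assuming the table.tab file naming described for hub_tables_save_all:

```tcl
# Sketch only: save a table image, then restore it later
SQL "write over lotdata"   ;# writes lotdata.tab in the working directory
# ... later, perhaps after a restart ...
SQL "read over lotdata"    ;# restores schema and rows; subscriptions are not restored
```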

//c 0 READ:  table - Table read from file ""

//c 3 READ: table - Error reading file "".
//c 7 READ: table - table exists and overwrite option not set.


REM comment
Syntax REM comment text
rem uncomment the following statements for option B 
Description This command lets you add comments to files of SQL commands that will be processed using the SQL include command. The # command and the rem command are equivalent.
Replies none


SELECT query table data
Syntax SELECT { * | column [, column]* } FROM table [ WHERE where_criteria ]

select help_text from help_config where lot_id='any lot' and step='etch'

select * from capt_log where sdate<='1996-01-01'

SELECT: capt_log event='Klingons near chimney' sdate='1992-12-25'
SELECT: capt_log event='read Dr. Spock''s book' sdate='1994-04-21'
//c 0 SELECT: 2 rows selected

SQL "select * from spc_datatype"

Description select is used routinely to lookup stored table data. You can use the symbol * to specify that you would like all of the column values in your query, or you can explicitly list the columns of interest. If you do not provide a WHERE clause, you obtain all rows of the table. See The Where Clause for the grammar of the where_criteria argument.

If select is executed in Tcl code from the SQL command, you obtain the Tcl Reply Format which is described in its own section. The format described below is obtained when the select statement is sent to a Datahub as a message command, or executed in Tcl code from the execute command.

One or more messages for each row, formatted as
SELECT: table [column=value]+
where value is enclosed in single quotes for character data,
and then a final message,

//c 0 SELECT: table - N row[s] selected

//c 3 SELECT: table - 0 rows selected

//c 1 SELECT: table - column "column" not in table
//c 5 SELECT: table - table not found
//c 5 SELECT: Table table record is corrupt
//c 5 SELECT: syntax error description


SET configure a Datahub data parameter
Syntax SET parameter=value

set server=DBM

Description The Datahub has a small set of configurable data parameters that can be assigned values using this command.
ADD_SEMICOLON
whether to add a semicolon to the end of SQL statements that are generated for subscriptions or saving table data. This option defaults to 0 (false). It can be set true to permit direct use of Datahub subscriptions and table files with Sqlite3.
CWD
current working directory. New code should use the Tcl cd command.
GROUP
DMH message group. Once the message system interface has been initialized using eof this parameter cannot be changed. See the mbx document for a discussion of message groups. The value of this parameter defaults to mbx.
SERVER
the command mailbox. Once the message system interface has been initialized using eof, the Datahub reads messages from this mailbox and processes them as SQL commands. The value of this parameter defaults to DATAHUB.
TRACE
the mailbox to send TRACE messages. This parameter is provided for compatibility with SEMATECH style servers. There are no TRACE features built into the Datahub software.
USE_TCL_PRECISION
floating point format control. When this value is 1, floating point values are formatted using the conventions of the Tcl interpreter, which is controlled by the global Tcl variable tcl_precision. When this parameter is set to 0, floating point values are printed using the C language sprintf "%g" convention which can cause loss of precision. The latter behavior is provided to be compatible with the SEMATECH SCC Data Server. The default value of this parameter is set to 1 in the C source code, but it is set to 0 in certain datahub Tcl/Tk scripts for compatibility. If you care, set it to your choice.
//c 0 SET:  Variable "parameter" set to "value".

//c 1 SET: Variable "parameter" not found.
//c 3 SET: Unable to set variable "parameter", error message


SOURCE source a file of Tcl code
Syntax SOURCE pathname

source c:/users/doe/zapp.tcl

source ../library/util.tcl

Description This command is used to explicitly load Tcl code such as procedure definitions into the Datahub. The functionality is equivalent to the Tcl command source.

You use this command repeatedly during development to reload a file of source code that you are editing. However, for production, many applications will take advantage of the interpreter's auto_path mechanism that loads unknown Tcl code only if it is actually executed. To use this mechanism, the developer uses the auto_mkindex command to build a tclIndex file, and adds the directories where the source code files are found to the auto_path list.
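The auto-loading setup described above might be sketched as follows; the directory name ../library is hypothetical:

```tcl
# Sketch only: index a directory of Tcl source files for auto-loading
auto_mkindex ../library *.tcl    ;# builds ../library/tclIndex
lappend auto_path [file join [pwd] ../library]
# procedures defined in ../library are now loaded on first use
```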

Replies The reply is an empty string or a single error message.


TELECT an alias for the select command
Syntax see select

telect * from menu_data

Description This command exists to provide the Tcl Reply Format for message system clients. In the name of compatibility, the reply to select statements received through the command mailbox is the older, one message per row style. Tcl applications can use "telect" instead of "select" to obtain the new format which is designed for ease of parsing in Tcl code.
Replies The Tcl Reply Format, which is described in its own section.


UPDATE change existing rows in a table
Syntax UPDATE table SET column=value [ , column=value ]* [ WHERE where_criteria ]

update fast_food set pace='slow', rate=4.5 where vendor='TurtleJoe'

SQL "update lot_data set is_removed=1 where lot_id='9803021A'"

Description This is the ANSI standard SQL statement for modifying the data of a single table. If you do not provide a WHERE clause, you update all rows of the table. See The Where Clause for the grammar of the where_criteria argument.
//c 0 UPDATE:   table - N row[s] updated.

//c 1 UPDATE:  table - column "column" not in table.
//c 3 UPDATE:  table - 0 rows updated.
//c 5 UPDATE:  table - table not found
//c 5 UPDATE:  Table table record is corrupt
//c 5 UPDATE:  syntax error description
//c 13 UPDATE:  unique key violation


WRITE save a table to a file
Syntax WRITE [OVER] table [pathname]

write over variable_data

write over variable_data /tmp/myfile.sql


Description This command writes a create statement and insert statements representing a table to a file. The file can be read and the table restored using the read command or the include command. The add_semicolon option controls whether semicolon statement terminators are written into the file data.

If pathname is specified, it is used as the path and file specification for the table information. Otherwise, the command creates a file by the name of table.tab in the current working directory.

//c 0 WRITE:   table - Table written to file "file" - N of N rows

//c 3 WRITE:  table - Table written to file "file" - N of M rows

//c 3 WRITE:  table - Unable to open file "file".
//c 5 WRITE:  table - table not found
//c 5 WRITE:  Table table record is corrupt
//c 7 WRITE:  table - file "file" exists and overwrite option not set.



Tcl Reply Format

The format described in this section is used for the results of select and telect queries executed with the Tcl SQL command, and for PROC subscription notifications. Similar formats are also used by other Tcl database software such as the odbc command.

The format is a seven element Tcl list. Each element is now described.

Element Index Description
notification 0 This element will be the literal string select for the return value of a select or telect query. For a subscription notification, it will be the literal text insert, update, delete, or drop.
tablename 1 This item with be the name of the database table. The name will be formatted exactly as it was in the table's create statement.
columns 2 This element will be a list of the column names in the same sequence as the data items in the data_list element. For a select statement, the columns are in the order of your request, or in the table create statement order if you queried for *. For a subscription notification, the list will always include the primary keys of the table.
key_columns 3 This element will be a list of the column names of the primary key columns.
row_count 4 This element returns a count of the data rows in the data_list element. For the usual insert, update, or delete subscription notification, the value will be 1 because a separate notification is issued for each row.
error_info 5 If your query has an error such as referring to a column or table that does not exist, or a syntax error, this element is a string containing a non-zero numeric return code, followed by a diagnostic message. The return code is the number that you see in the old style "//c number" reply documentation. If you have a valid query that does not find any data rows, it is not an error, and the error_info element is an empty string.
data_list 6 This element is a list of data rows. Each data row has the elements named in the columns element, in the same order. Character strings that have imbedded white space are delimited in Tcl fashion using braces instead of single quotes. There are as many rows as specified by the row_count element.

Here is example code displaying this format:

  # create a table with some data
  SQL "create table a (i int, str varchar(20), d double)"
  SQL "insert into a (i,str,d) values (1,'hello world', 1.1)"
  SQL "insert into a (i,str,d) values (2, 'two', 2.2)"

  puts [SQL "select * from a"]
select a {i str d} {} 2 {} {{1 {hello world} 1.1} {2 two 2.2}}
  set reply [SQL "select str from a where i=1"]
select a {str} {} 1 {} {{{hello world}}}
  # Here is a methodical way to parse the reply for a single value
  set datarows [lindex $reply 6]
  set firstrow [lindex $datarows 0]
  set firstvalue [lindex $firstrow 0]
  puts "str=$firstvalue"
str=hello world
  # As of Tcl 8.4, lindex can take several indices at once
  # so this is the preferred method to isolate a value.
  set firstvalue [lindex $reply 6 0 0]
  puts "str=$firstvalue"
str=hello world
  # this example is compact and backwards compatible:
  set str [lindex [lindex [lindex $reply 6] 0] 0]
  # The rset command is even more compact
  # but is not more efficient unless you are using several values
  # such as the row count, the errtext, and the data value
  rset $reply "tag table cols keys ct errtext {{{str}}}"
  if { $ct == 1 } { puts "str=$str" }\
  elseif { $errtext != "" } { puts "Error: errtext=$errtext" }

  puts [SQL "select bad syntax from a"]
select {a} {} {} 0 {1 a - column "bad" not in table} {}

  puts [SQL "select * from a where i=200"]
select a {i str d} {} 0 {} {}
You may find the vset Tcl command extremely useful to assign the data of a row to the column names, as in the example:
set reply [SQL "select * from lotdata where lot='x'"]
set cols [lindex $reply 2]
foreach row [lindex $reply 6] {
    vset $row $cols
    puts lot=$lot
}



Documentation Notation

BOLD is used to indicate commands, keywords, or punctuation that needs to be entered as written. With the SQL commands, the keywords are written in uppercase for emphasis, but you can enter them in lowercase too. Tcl commands such as SQL and hub_tables are case sensitive and are entered exactly as written.

Italics are used to indicate names or other items where you supply your own instance value. For example, where we have written table you are expected to supply the name of a table.

We use square brackets to enclose items that are optionally specified. For example [WHERE where_criteria] indicates this clause may be omitted. The square brackets themselves are not entered.

When you see the * character following a phrase, it means the phrase is optionally repeated 0 or more times. For example, column [, column]* indicates that one column name must be supplied, followed by zero or more , column phrases.

Finally, the braces are used to show sets of alternate choices, with the choices separated by the | character. For example, { PROC=procname | SQLPROC=procname }. The braces and the | character are part of the notation and are not entered when the command is used.

Compatibility Notes

The new Datahub provided by Hume Integration Software has been written from "scratch" as a set of C code extensions to the Tcl/Tk interpreter. While the software retains command syntax and reply format compatibility with the SEMATECH SCC Data Server and the extended version developed at Motorola, it does not use any of the source code from these applications, such as the SEMATECH Generic Server. Therefore, there are a substantial number of changes that you may need to be aware of in running your existing applications.

First of all, the new Datahub returns the same reply formats for the ordinary, successful usage of commands, but the error replies are for the most part different. The basic error replies that might be used to test for table existence, such as the "table not found" reply are preserved. However, in most cases, you will find that the error replies of the new Datahub are more detailed and specific, and will help you develop your applications faster. For example, if you try to insert a duplicate row in a keyed table, the old data server replied:

//c 3 INSERT:  Unable to insert.
The new Datahub replies:
//c 3 INSERT:  Unable to insert.  Unique key violation

With the new Datahub, a table that has primary key columns does not allow duplicate rows. This is a change - the old versions allowed duplicate rows unless the keyword NODUPROWS was supplied in the table create statement. For compatibility, the new Datahub accepts and ignores the keyword NODUPROWS. With the new Datahub, if you want a table that allows duplicate rows, you need to declare it without key columns.

The new Datahub allows you to update the values of any columns in an update statement. The old versions would not let you update the values of key columns; you had to delete the row and insert the updated row instead. As explained with the open command, the new Datahub processes subscription notifications of key value updates as both a delete and an insert notification, and therefore preserves interoperable compatibility with the old versions and existing applications.
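
For example (a sketch; the lot_data table and its lot_id key column are illustrative), updating a key value generates two notifications:

```tcl
# lot_data is a hypothetical table with primary key column lot_id
SQL "update lot_data set lot_id='9803021B' where lot_id='9803021A'"
# a subscriber to lot_data receives a delete notification for the old
# key value '9803021A', followed by an insert notification carrying
# the row with the new key value '9803021B'
```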

There are other changes with the subscription features. The default REPLYTO for a message subscription was changed from the data server's command mailbox to NULL. This makes it safe to feed a message subscription to another Datahub without having the subscription replies coming into the command mailbox if a developer forgets to set REPLYTO.

Also, there is a new message subscription option, TELECT. When this option is indicated for a message subscription, the format used is the Tcl Reply Format instead of SQL statements.

Another change was made to the subscription functionality in order to make it more consistent and easier to explain. The old versions always include key column values on delete and update notifications even when they are not requested. However, insert notifications only provide the columns that are explicitly requested. The new Datahub always includes the key column values, on all of the data change notifications, whether they are explicitly requested or not.

Another improvement is that the new Datahub is able to properly handle newlines when they are imbedded in the data of SQL statements that are read from files using the read or include commands. The old versions use the Generic Server INCLUDE code which reads each line in a file as a separate command. With the older versions, you cannot properly read the data of a table, if a row has an imbedded newline in its data.

The older versions use the eof command to signify both the end of command line or console input, as well as a command to initialize message system communication. The new Datahub uses the eof command only as a command to initialize message system communication, and handles command line, file, and console input consistently with other Tcl/Tk interpreter applications.

The new Datahub does not support the old sleep command. You can accomplish the same functionality using the after Tcl command.
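
For example, a blocking one-second pause, or a non-blocking scheduled callback, can be written with after:

```tcl
after 1000                             ;# block for 1000 ms, like the old sleep
after 1000 {puts "one second later"}   ;# or schedule work without blocking the event loop
```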

The new Datahub is compatible with the old versions in allowing trailing semicolons on SQL commands. This is not ANSI SQL and is deprecated for new applications.

The new Datahub overcomes a very nasty problem with the old versions. When the old versions are shutdown, subscribers are sent delete notifications as all of the application data is deleted. Many applications feed subscription notifications to a persistent database, and risk losing all of their data if the Data Server is shutdown without the subscriptions being closed, and the persistent database interface is not shutdown first. The new Datahub exits without sending these notifications.

The new Datahub also uses binary signatures to protect against data corruption. With the old versions, if you close a subscription in a procedure notification of the same subscription, or drop a table in a procedure notification on that table, the program will almost certainly exit with a segmentation violation. The new Datahub will take these changes in stride.

There have been other improvements that should not adversely affect existing applications. For example, the memory utilization of the new Datahub is substantially improved over the old versions.

Finally, overall performance has been radically improved. The new Datahub uses real table management algorithms that scale as log N. The typical performance gain is 5-10X, and the advantage grows for larger tables.

Document Version

Date of last revision: $Date: 2022/11/30 20:30:36 $

This document covers the Hume Integration Software developed Datahub that is available for the Tcl 8.5/Tk 8.5 environment on the Windows NT and UNIX/POSIX platforms.

Go to the Datahub Contents

Leave this document and go to the Tcl/Tk Resource Index