SQL Workbench/J implements a set of additional SQL commands that are executed directly by SQL Workbench/J (i.e. not by the database server). These commands can be used like any other SQL command (such as UPDATE or SELECT) inside SQL Workbench/J, i.e. inside the editor or as part of a SQL script that is run through SQL Workbench/J in batch mode. As these commands are implemented by SQL Workbench/J, you will not be able to use them when running your SQL scripts with a different client program, e.g. psql, SQuirreL or SQL*Plus.
Creates an XML report of selected tables. This report can be used to generate an HTML documentation of the database (e.g. using the XSLT command). The report can also be generated from within the Database Object Explorer.
The resulting XML file can be transformed into an HTML documentation of your database schema. Sample stylesheets can be downloaded from http://www.sql-workbench.net/xstl.html. If you have XSLT stylesheets that you would like to share, please send them to <support@sql-workbench.net>.
Note: To see table and column comments with an Oracle database, you need to enable remarks reporting for the JDBC driver, otherwise the driver will not return comments. To see the "comment" values from SQL Server's extended properties, please set up the property retrieval as described here.
The command supports the following parameters:
Parameter | Description |
---|---|
-file | The filename of the output file. |
-objects |
A (comma separated) list of objects to report. The default is all objects that are "tables" or views. The list of possible objects corresponds to the objects shown in the "Objects" tab of the DbExplorer.
If you want to generate the report on tables from different schemas, you have to use fully qualified names in the list. This parameter supports auto-completion. |
-schemas |
A (comma separated) list of schemas to generate the report from. For each user/schema all tables are included in the report.
The possible values for this parameter correspond to the "Schema" dropdown in the DbExplorer. The parameter supports auto-completion and will show a list of available schemas. |
-types |
A (comma separated) list of "table like" object types to include. The values for this parameter correspond to the values shown in the "types" dropdown in the "Objects" tab of the DbExplorer. The parameter supports auto-completion and will show a list of the available object types for the current DBMS.
You can include any type shown in the DbExplorer's Objects tab. |
-excludeObjectNames |
A (comma separated) list of tables to exclude from reporting. This is only used if -objects is also specified. To create a report on all tables except those that start with 'DEV', use: -objects=* -excludeObjectNames=DEV*
|
-objectTypeNames |
This parameter can be repeated several times to define the object names per object type to be retrieved. The format of the argument is -objectTypeNames=typename:namepattern.
The following will select the tables person and address, all sequences starting with "t" and the view v_person: -objectTypeNames='table:person,address' -objectTypeNames=sequence:t* -objectTypeNames=view:v_person
The type names are the same ones that can be used with the -types parameter. The name pattern may also contain a schema name: -objectTypeNames='table:cust.person,accounting.address' -objectTypeNames=view:public.*
This can also be used to restrict the retrieval of stored procedures.
If this parameter is used at least once, the -objects and -types parameters are ignored.
The exclusion pattern defined through -excludeObjectNames is applied to all object types. |
-includeTables | Controls the output of table information for the report. The default is true. Valid values are true, false.
|
-includeSequences |
Controls the output of sequence information for the report. The default is false. Valid values are true, false.
Adding SEQUENCE to the list of types specified with the -types parameter has the same effect. |
-includeTableGrants | If tables are included in the output, the grants for each table can also be included with this parameter. The default value is false.
|
-includeProcedures | Controls the output of stored procedure information for the report. The default is false. Valid values are true, false.
|
-includeTriggers |
This parameter controls if table triggers are added to the output. The default value is true.
|
-reportTitle |
Defines the title for the generated XML file. The specified title is written into the tag <report-title> and can be used when transforming the XML e.g. into an HTML file.
|
-writeFullSource |
By default the source code for views is written into the XML file as it is retrieved from the DBMS. This might not be a complete statement that can be used to re-create the view. When -writeFullSource=true is specified, SQL Workbench/J will generate a complete CREATE VIEW statement.
The default is false. Valid values are true, false. |
-styleSheet | Apply an XSLT transformation to the generated XML file. |
-xsltOutput | The name of the generated output file when applying the XSLT transformation. |
-xsltParameter |
A list of parameters (key/value pairs) that should be passed to the XSLT processor. When using e.g. the wbreport2liquibase.xslt stylesheet, the value of the author attribute can be set using -xsltParameter="authorName=42". This parameter can be provided multiple times for multiple parameters, e.g. when using wbreport2pg.xslt: -xsltParameter="makeLowerCase=true" -xsltParameter="useJdbcTypes=true"
|
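Putting several of these parameters together, a call could look like the following sketch (the command is invoked as WbSchemaReport; the schema names, file names and stylesheet name are illustrative):

```sql
-- Report all "table like" objects from two schemas, include grants,
-- and transform the resulting XML into HTML in one step
WbSchemaReport -file=metadata.xml
               -schemas=public,accounting
               -includeTableGrants=true
               -styleSheet=wbreport2html.xslt
               -xsltOutput=metadata.html;
```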
The command WbGrepSource
can be used to search
in the source code of the specified database objects.
The command basically retrieves the source code for all selected objects and does a simple search on that source code. The source code that is searched is identical to the source code that is displayed in the "Source" tab in the various DbExplorer panels.
The search values can be regular expressions. When searching the source code the specified expression must be found somewhere in the source. The regex is not used to match the entire source.
The command supports the following parameters:
Parameter | Description |
---|---|
-searchValues |
A comma separated list of values to be searched for. |
-useRegex |
Valid values are true, false.
If this parameter is set to true, the values specified with -searchValues are treated as regular expressions.
The default for this parameter is false. |
-matchAll |
Valid values are true, false.
This specifies whether all values specified with -searchValues have to be found in an object's source (-matchAll=true), or whether finding at least one of them is enough (-matchAll=false).
The default for this parameter is false. |
-ignoreCase |
Valid values are true, false. When set to true, the comparison is done case-insensitively ("ARTHUR" will match "Arthur" or "arthur").
The default for this parameter is true. |
-types |
Specifies the object types to be searched. The values for this parameter are the same as in the "Type" drop down of DbExplorer's table list.
When specifying a type that contains a space, the type name needs to be enclosed in quotes, e.g. -types="materialized view".
This parameter supports auto-completion. |
-objects |
A list of object names to be searched. These names may contain SQL wildcards. This parameter supports auto-completion. |
-schemas |
Specifies a list of schemas to be searched (for DBMS that support schemas). If this parameter is not specified the current schema is searched. This parameter supports auto-completion. |
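As a sketch of how these parameters combine (the schema names and the search value are illustrative):

```sql
-- Search the source of all views and procedures in two schemas
-- for the table name "person", ignoring case
WbGrepSource -searchValues=person
             -types=view,procedure
             -schemas=public,accounting
             -ignoreCase=true;
```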
The functionality of the WbGrepSource command is also available through a GUI dialog.
The command WbGrepData
can be used to search
for occurrences of a certain value in all columns of multiple tables.
It is the command line version of the (client side) Search Table Data tab in the DbExplorer. A more detailed description of how the searching is performed is available in that chapter.
Note: To search the data of a table, a SELECT * FROM the_table is executed and processed on a row-by-row basis. Although SQL Workbench/J only keeps one row at a time in memory, it is possible that the JDBC driver caches the full result set in memory. Please see the chapter Common problems for your DBMS to check if the JDBC driver you are using caches result sets.
The command supports the following parameters:
Parameter | Description |
---|---|
-searchValue |
The value to be searched for.
This parameter is ignored when using -compareType=isNull. |
-ignoreCase |
Valid values are true, false. When set to true, the comparison is done case-insensitively ("ARTHUR" will match "Arthur" or "arthur").
The default for this parameter is true. |
-compareType |
Valid values are contains, equals, startsWith, isNull.
When specifying isNull, the search value is ignored and rows are returned where any column is NULL.
The default for this parameter is contains. |
-tables |
A list of table names to be searched. These names may contain SQL wildcards. This parameter supports auto-completion. |
-types |
Defines the object types to be searched. By default tables and views are searched. This parameter supports auto-completion. |
-excludeTables |
A list of table names to be excluded from the search. If e.g. the wildcard for -tables would select too many tables, you can exclude individual tables with this parameter. The parameter values may include SQL wildcards.
|
-retrieveCLOB |
By default CLOB columns are retrieved and searched. If this parameter is set to false, CLOB columns are not retrieved. If the search value is not expected in columns of that type, excluding them from the search will speed up data retrieval (and thus the searching).
Only columns reported as CLOB by the JDBC driver are excluded. |
-retrieveBLOB |
By default BLOB columns are not retrieved and thus not searched.
If -treatBlobAs is specified, this parameter is ignored, because BLOB columns are then retrieved and converted to character data. |
-treatBlobAs |
If this parameter specifies a valid encoding, binary (aka "BLOB") columns will be retrieved and converted to a character value using the specified encoding. That character value is then searched.
|
The following statement will search for the text foo in all columns and all rows of the table person. It will find the values foobar, somefoo or notfoobar:
WbGrepData -searchValue=foo -tables=person -ignoreCase=true
-ignoreCase=true
is the default behavior and can be omitted.
The following statement will search for the text foobar
in all columns
and all tables.
WbGrepData -searchValue=foobar -tables=*
The following statement will search for the text foo in all columns and all tables. It will match the value foobar, but not barfoo:
WbGrepData -searchValue=foo -compareType=startsWith -tables=*
The following statement will search for the text foo in all columns and all tables. It will only match the values foo or FOO, but not somefoobar:
WbGrepData -searchValue=foo -compareType=equals -tables=*
The following statement will search for any value where two characters are followed by two numbers. It will match foo42 and bar12:
WbGrepData -searchValue="[a-z]{2}[0-9]{2}" -compareType=contains -tables=person
The column values are only tested for whether the regular expression is contained, not for an exact match. The above search will therefore also return foo999.
To get an exact match using the contains type, the regular expression needs to be anchored at the start and the end. The following will only find values that consist of (exactly) two characters followed by (exactly) two digits:
WbGrepData -searchValue="^[a-z]{2}[0-9]{2}$" -compareType=contains -tables=person
The following statement will return rows where any column either contains the value foo
or
the value bar
:
WbGrepData -searchValue="foo|bar" -compareType=contains -tables=person
The column values are only tested for whether the regular expression is contained, not for an exact match. The above search will therefore also return foo999.
For more information about regular expressions please visit: Regular-Expressions.info
This defines an internal variable which is used for variable substitution during SQL execution.
There are two possibilities to define a variable. The short syntax is: WbVarDef variable=value
The long syntax allows defining variables in a different way:
Parameter | Description |
---|---|
-variable | The name of the variable to be defined. |
-value | The value of the variable. |
-file | Read the variable definitions from the specified file. |
-contentFile | Read the contents of the variable from the specified file. |
-values | Define a comma separated list of values that are used in the dialog that is shown when prompting for variable values. |
More details and examples can be found in the chapter: Variable substitution
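A short sketch of both variants together with a statement that uses the defined variables (this assumes the default placeholder syntax $[varname] described in the Variable substitution chapter; table and variable names are illustrative):

```sql
-- Short syntax
WbVarDef current_id=42;

-- Long syntax; -file or -contentFile could be used instead of -value
WbVarDef -variable=status -value=active;

-- Use the variables in a regular statement
SELECT *
FROM orders
WHERE customer_id = $[current_id]
  AND status = '$[status]';
```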
This removes an internal variable from the variable list. Details can be found in the chapter Variable substitution.
This lists all defined variables from the variable list. Details can be found in the chapter Variable substitution.
The WbConfirm
command pauses the execution of the
current script and displays a message. You can then choose to stop
the script or continue.
WbConfirm
can be called in three different ways:
- Without any parameter; a default message will be displayed.
- With just a message text, e.g. WbConfirm Do you really want to drop everything?
- Supplying parameters for the message, the text for the "Yes" choice and the text for the "No" choice using standard SQL Workbench/J parameters: WbConfirm -message="Do you really want to drop everything?" -yesText="OK, go ahead" -noText="No, please stop"
When using WbConfirm in console (or interactive batch) mode, the check if the "Yes" choice was typed by the user is done by testing if the "Yes" value starts with the text the user enters (ignoring upper/lowercase differences). So if the "Yes" text is set to Continue, the user can enter c, co, cont and so on. Because of that, the "No" text should not start with the same letters as the "Yes" text. When using -yesText=Continue and -noText=Cancel and the user enters C, this would be regarded as a "Yes".
This command can be used to prevent accidental execution of a script even if confirm updates is not enabled.
This command has no effect in batch mode unless the -interactive
parameter was specified.
The WbMessage command pauses the execution of the current script, displays a message and waits until the dialog is closed. Unlike WbConfirm, the script will always continue once the message dialog is closed.
WbMessage
can be called in two different ways:
With just a message text, e.g. WbMessage Done!
Supplying parameters for the message and the dialog title:
WbMessage -message="Script finished" -title="SQL Script"
This command has no effect in batch or console mode.
If you want to run a stored procedure that has OUT
parameters, you have to use the WbCall
command to correctly see the returned value of the parameters.
Consider the following (Oracle) procedure:
```sql
CREATE OR REPLACE procedure return_answer(answer OUT integer)
IS
BEGIN
  answer := 42;
END;
/
```
To call this procedure you need to supply a placeholder indicating that a parameter is needed.
```
SQL> WbCall return_answer(?);

PARAMETER | VALUE
----------+------
ANSWER    | 42

(1 Row)

Converted procedure call to JDBC syntax: {call return_answer(?)}

Execution time: 0.453s
SQL>
```
If the stored procedure has a REF CURSOR (as an output parameter), WbCall
will detect this, and retrieve the result of the ref cursors.
Consider the following (Oracle) stored procedure:
```sql
CREATE PROCEDURE ref_cursor_example(pid number,
                                    person_result out sys_refcursor,
                                    addr_result out sys_refcursor)
IS
BEGIN
  OPEN person_result FOR
    SELECT *
    FROM person
    WHERE person_id = pid;

  OPEN addr_result FOR
    SELECT a.*
    FROM address a
      JOIN person p ON a.address_id = p.address_id
    WHERE p.person_id = pid;
END;
/
```
To call this procedure you use the same syntax as with a regular OUT parameter:
WbCall ref_cursor_example(42, ?, ?);
SQL Workbench/J will display two result tabs, one for each cursor returned by the procedure. If you use
WbCall ref_cursor_example(?, ?, ?)
you will be prompted to enter a
value for the first parameter (because that is an IN parameter).
When using ref cursors in PostgreSQL, normally such a function can simply be used inside a SELECT
statement, e.g. SELECT * FROM refcursorfunc();
. Unfortunately the PostgreSQL JDBC driver
does not handle this correctly and you will not see the result set returned by the function.
To display the result set returned by such a function, you have to use WbCall
as well
```sql
CREATE OR REPLACE FUNCTION refcursorfunc()
  RETURNS refcursor
AS
$$
DECLARE
  mycurs refcursor;
BEGIN
  OPEN mycurs FOR SELECT * FROM PERSON;
  RETURN mycurs;
END;
$$ LANGUAGE plpgsql;
/
```
You can call this function using
WbCall refcursorfunc();
This will then display the result from the SELECT inside the function.
With the WbInclude
command you run SQL scripts without
actually loading them into the editor, or call other scripts from within
a script. The format of the command is WbInclude -file=filename;
.
For DBMS other than MS SQL, the command can be abbreviated using the @ sign: @filename; is equivalent to WbInclude -file=filename;.
The called script may also include other scripts. Relative filenames (e.g. as parameters for SQL Workbench/J commands) in the script are always resolved to the directory where the script is located, not the current directory of the application.
The reason for excluding MS SQL is that, when creating stored procedures in MS SQL, the procedure parameters are identified using the @ sign, thus SQL Workbench/J would interpret the lines with the variable definitions as the WbInclude command. If you want to use the @ command
with MS SQL, you can configure this in your
workbench.settings
configuration file.
The long version of the command accepts additional parameters. When using the long version, the filename needs to be passed as a parameter as well.
Only files up to a certain size will be read into memory. Files exceeding
that size will be processed statement by statement. In this case the automatic
detection of the alternate delimiter will
not work. If your scripts exceed the maximum size and you do use the alternate delimiter
you will have to use the long version of the command using the -file
and -delimiter
parameters.
The command supports the following parameters:
Parameter | Description |
---|---|
-file | The filename of the file to be included. |
-continueOnError |
Defines the behavior if an error occurs in one of the statements.
If this is set to true then script execution will continue
even if one statement fails. If set to false script execution
will be halted on the first error. The default value is false
|
-delimiter |
Specify a delimiter to be used that is different from the standard semicolon.
A non-standard delimiter will be required to be on a line of its own. So with e.g. / as the delimiter, select * from person followed by / on a line of its own will work, but putting the delimiter at the end of a line will not work: select * from person/
If this parameter is not specified, the standard semicolon is used. |
-encoding | Specify the encoding of the input file. If no encoding is specified, the default encoding for the current platform (operating system) is used. |
-verbose |
Controls the logging level of the executed commands.
-verbose=true has the same effect as adding a
WbFeedback on inside the called script.
-verbose=false has the same effect as adding
the statement WbFeedback off to the called script.
|
-displayResult |
By default any result set that is returned (e.g. by a SELECT statement) inside the script is not displayed. Setting this parameter to true will display those results. |
-printStatements |
If true, every SQL statement will be printed before execution. This is mainly intended for console usage, but works in the GUI as well. |
-showTiming |
If true, display the execution time of every SQL statement and the overall execution time of the script. |
-useSavepoint |
Control if each statement from the file should be guarded with a savepoint
when executing the script. Setting this to true will make
execution of the script more robust, but also slows down the processing
of the SQL statements.
|
-ignoreDropErrors | Controls if errors resulting from DROP statements should be treated as an error or as a warning. |
-searchFor |
Defines search and replace parameters to change the SQL statements before they are sent to the database. This can e.g. be used to replace the schema name in DDL script that uses fully qualified table names. The replacement is done without checking the syntax of the statements. If the search value is contained in a string literal or a SQL comment, it is also replaced. |
WbInclude also supports conditional execution.
Execute my_script.sql
@my_script.sql;
Execute my_script.sql
but abort on the first error
WbInclude -file="my_script.sql" -continueOnError=false;
Execute the script create_tables.sql and change all occurrences of oldschema to new_schema
WbInclude -file=create_tables.sql -searchFor="oldschema." -replaceWith="new_schema."
Execute a large script that uses a non-standard statement delimiter:
WbInclude -file=insert_10million_rows.sql -delimiter='/';
If you manage your stored procedures in Liquibase ChangeLogs, you can use this command to run the necessary SQL directly from the XML file, without the need to copy and paste it into SQL Workbench/J. This is useful when testing and developing stored procedures that are managed by a Liquibase changeLog.
Note: This is NOT a replacement for Liquibase. It will not convert any of the Liquibase tags to "real" SQL. It is merely a convenient way to extract and run SQL statements stored in a Liquibase XML file!
The attribute splitStatements
for the sql
tag is evaluated. The delimiter used to split the statements follows the usual SQL Workbench/J rules (including the use
of the alternate delimiter).
WbRunLB
supports the following parameters:
Parameter | Description |
---|---|
-file |
The filename of the Liquibase changeLog (XML) file. The <include> tag is NOT supported! SQL statements stored in files
that are referenced using Liquibase's include tag will not be processed.
|
-changeSet |
A list of changeSet ids to be run. If this parameter is omitted, the SQL from all changesets (containing supported tags) is executed. The value specified can include the value for the author and the id, separated by a double colon, e.g. arthur::42.
You can specify wildcards before or after the double colon, e.g. *::42 or arthur::*.
If the parameter value does not contain the double colon, it is assumed to be an ID only.
This parameter supports auto-completion if the -file parameter has already been specified. |
-continueOnError |
Defines the behaviour if an error occurs in one of the statements.
If this is set to true then script execution will continue
even if one statement fails. If set to false script execution
will be halted on the first error. The default value is false
|
-encoding | Specify the encoding of the input file. If no encoding is specified, UTF-8 is used. |
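A sketch of a typical call (the file name and the changeset author are illustrative):

```sql
-- Run all changesets authored by "arthur" from the changelog,
-- stopping at the first failing statement
WbRunLB -file=changelog.xml
        -changeSet="arthur::*"
        -continueOnError=false;
```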
To be able to directly edit data in the result set (grid) SQL Workbench/J needs
a primary key on the underlying table. In some cases these primary keys are not present or
cannot be retrieved from the database (e.g. when using updateable views).
To still be able to automatically update a result based on those tables (without always
manually defining the primary key) you can manually define a primary
key using the WbDefinePk
command.
Assuming you have an updateable view called v_person where the primary key is the column person_id: when you simply do a SELECT * FROM v_person, SQL Workbench/J will prompt you for the
primary key when you try to save changes to the data. If you run
WbDefinePk v_person=person_id
before retrieving the result, SQL Workbench/J will automatically
use the person_id
as the primary key (just as if this
information had been retrieved from the database).
To delete a definition simply call the command with an empty column list:
WbDefinePk v_person=
If you want to define certain mappings permanently, this can be done using a mapping file that is specified in the configuration file. The file specified has to be a text file with each line containing one primary key definition in the same format as passed to this command. The global mapping will automatically be saved when you exit the application if a filename has been defined. If no file is defined, then all PK mappings that you define are lost when exiting the application (unless you explicitly save them using WbSavePkMap). A mapping file containing the two lines

v_person=person_id
v_data=id1,id2
will define a primary key for the view v_person
and one for
the view v_data
. The definitions stored in that file can
be overwritten using the WbDefinePk
command, but those changes
won't be saved to the file. This file will be read for all database connections and
is not profile specific. If you have conflicting primary key definitions for
different databases, you'll need to execute the WbDefinePk
command
each time, rather than specifying the keys in the mapping file.
When you define the key columns for a table through the GUI, you have the option
to remember the defined mapping. If this option is checked, then that mapping
will be added to the global map (just as if you had executed WbDefinePk
manually).
Note: The mappings will be stored with lowercase table names internally, regardless of how you specify them.
To view the currently defined primary keys, execute the command
WbListPkDef
.
To load the additional primary key definitions from a file, you can
use the WbLoadPKMap
command. If a filename is defined
in the configuration file then that
file is loaded. Alternatively if no file is configured, or if you want to
load a different file, you can specify the filename using the -file
parameter.
To save the current primary key definitions to a file, you can
use the WbSavePKMap
command. If a filename is defined
in the configuration file then the
definition is stored in that file. Alternatively if no file is configured, or if you want to
store the current mapping into a different file, you can specify the filename
using the -file
parameter.
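The two commands can be sketched together as follows (the file name is illustrative):

```sql
-- Save the currently defined primary key mappings
WbSavePkMap -file=pk_mapping.def;

-- ...later, possibly in a different session:
WbLoadPkMap -file=pk_mapping.def;
```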
The default fetch size for a connection can be defined in the connection profile. Using the
command WbFetchSize
you can change the fetch size without changing the connection profile.
The following script changes the default fetch size to 2500 rows and then runs a WbExport
command.
```sql
WbFetchSize 2500;

WbExport -sourceTable=person
         -type=text
         -file=/temp/person.txt;
```
WbFetchSize
will not change the current connection profile.
To send several SQL Statements as a single "batch" to the database server, the two commands WbStartBatch and WbEndBatch can be used.
All statements between these two will be sent as a single statement (using executeBatch()
) to the server.
Note that not all JDBC drivers support batched statements, and which kinds of statements can be batched also varies between drivers. Most drivers will not accept different types of statements in the same batch, e.g. mixing DELETE and INSERT.
To send a group of statements as a single batch, simply use the command WbStartBatch
to mark the beginning and
WbEndBatch
to mark the end. You have to run all statements together either by using "Execute all" or by selecting all
statements (including WbStartBatch and WbEndBatch) and then using "Execute selected". The following example sends all INSERT statements
as a single batch to the database server:
```sql
WbStartBatch;

INSERT INTO person (id, firstname, lastname) VALUES (1, 'Arthur', 'Dent');
INSERT INTO person (id, firstname, lastname) VALUES (2, 'Ford', 'Prefect');
INSERT INTO person (id, firstname, lastname) VALUES (3, 'Zaphod', 'Beeblebrox');
INSERT INTO person (id, firstname, lastname) VALUES (4, 'Tricia', 'McMillian');

WbEndBatch;

COMMIT;
```
To save the contents of a BLOB
or CLOB
column
into an external file the WbSelectBlob
command can be used. Most DBMS
support reading of CLOB
(character data) columns directly, so depending
on your DBMS (and JDBC driver) this command might only be needed for binary data.
The syntax is very similar to the regular SELECT
statement; an additional
INTO
keyword specifies the name of the external file into which the
data should be written:
```sql
WbSelectBlob blob_column
INTO c:/temp/image.bmp
FROM theTable
WHERE id = 42;
```
Even if you specify more than one column in the column list, SQL Workbench/J will only use the first column. If the SELECT returns more than one row, then one output file will be created for each row. Additional files will be created with a counter indicating the row number from the result. In the above example, image.bmp, image_1.bmp, image_2.bmp and so on would be created.
WbSelectBlob
is intended for an ad-hoc retrieval of a single LOB column.
If you need to extract the contents of several LOB rows and columns it is recommended to
use the WbExport command.
You can also manipulate (save, view, upload) the contents of BLOB columns in a result set. Please refer to BLOB support for details.
Normally SQL Workbench/J prints the results for each statement
into the message panel. As this feedback can slow down the execution
of large scripts, you can disable the feedback using the WbFeedback
command. When WbFeedback OFF
is executed, only a summary of the
number of executed statements will be displayed, once the script execution has
finished. This is the same behaviour as selecting "Consolidate script log" in the
options window. The only difference is that the setting through WbFeedback
is temporary and does not affect the global setting.
WbFeedback traceOn
can be used to enable printing of every executed statement
to the screen. The SQL statement printed will be the one after variable substitution and macro expansion.
WbFeedback traceOff
will turn tracing of statements off.
The SET
command is passed on directly to the driver,
except for the parameters described in this chapter because they
have an equivalent JDBC call which will be executed instead.
Oracle does not have a SQL SET command. The SET command that is available in SQL*Plus is a specific SQL*Plus command and will not work with other client software. Most of the SQL*Plus SET commands only make sense within SQL*Plus (e.g. formatting of the results). To be able to run SQL scripts that are intended for Oracle SQL*Plus, any error reported from the SET command when running against an Oracle database will silently be ignored and only logged as a warning.
SET feedback ON/OFF
is equivalent to the WbFeedback
command, but mimics the syntax of Oracle's SQL*Plus utility.
With the command SET autocommit ON/OFF, autocommit can be turned on or off for the current connection. This is equivalent to setting the autocommit property in the connection profile or toggling the state of the corresponding menu item.
Limits the number of rows returned by the next statement. The behaviour of this command is a bit different between the console mode and the GUI mode. In console mode, the maxrows setting stays in effect until you explicitly change it back using SET maxrows again.
In GUI mode, the maxrows setting is only in effect for the script currently being executed and will only temporarily overwrite any value entered in the "Max. Rows" field.
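For example (the table name is illustrative):

```sql
-- Limit the next result to 100 rows
SET maxrows 100;
SELECT * FROM orders;
```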
The following options for the SET command are only available when being connected to an Oracle database.
SET serveroutput on
is equivalent to the ENABLEOUT
command and SET serveroutput off
is equivalent to DISABLEOUT command.
This enables or disables the "autotrace" feature similar to the one in SQL*Plus. The syntax is equivalent to the SQL*Plus command and supports the following options:
Option | Description |
---|---|
ON |
Turns on autotrace mode. After running a statement, the statement result (if it is a query), the statistics and the execution plan for that statement are displayed as separate result tabs. |
OFF |
Turns off the autotrace mode. |
TRACEONLY |
Like ON, but the result of the query is not displayed. |
REALPLAN |
This is an extension to the SQL*Plus autotrace syntax.
Using REALPLAN instead of EXPLAIN shows the execution plan that was actually used by the database, rather than the plan generated by EXPLAIN PLAN. |
The information shown in autotrace mode can be controlled with two options after the ON or TRACEONLY parameter. STATISTICS will fetch the statistics about the execution and EXPLAIN will display the execution plan for the statement. If no additional parameter is specified, EXPLAIN STATISTICS is used.
If statistics are requested, query results will be fetched from the database server but they will not be displayed.
Unlike SQL*Plus, the keywords (AUTOTRACE, STATISTICS, EXPLAIN
) cannot be abbreviated!
For more information about the prerequisites for the autotrace mode, see the description of DBMS specific features.
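A short sketch of a typical autotrace session (the query is illustrative):

```sql
-- Show execution plan and statistics, but suppress the query result
SET autotrace traceonly;

SELECT *
FROM person
WHERE person_id = 42;

-- Turn autotrace off again
SET autotrace off;
```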
In the connection profile two options can be specified to define the behavior when running commands that might change or update the database: a "read only" mode that ignores such commands and a "confirm all" mode, where you need to confirm any statement that might change the database.
These states can temporarily be changed without changing the profile using the WbMode
command.
Note: This changes the mode for all editor tabs, not only for the one where you run the command.
Parameters for the WbMode
command are:
Parameter | Description |
---|---|
reset |
Resets the flags to the profile's definition |
normal |
Makes all changes possible (turns off read only and confirmations) |
confirm |
Enables confirmation for all updating commands |
readonly |
Turns on the read only mode |
The following example will turn on read only mode for the current connection, so that any subsequent statement that updates the database will be ignored:
WbMode readonly;
To change the current connection back to the settings from the profile use:
WbMode reset;
The command WbGenerateDrop
can be used to generate a SQL script for a table that
will drop all foreign keys referencing that table, then a DROP
statement for that table
and the statements to re-create the foreign keys referencing that table.
This is useful if you need to re-create a table but don't want to manually delete all referencing foreign keys, especially if the DBMS does not support a cascading DROP.
This is also available in the DbExplorer's context menu as "Generate DROP script".
The command supports the following parameters.
Parameter | Description |
---|---|
-tables |
A comma separated list of tables, e.g. |
-includeCreate |
Valid values:
By default |
-onlyForeignkeys |
Valid values:
When using |
-sortByType |
Valid values:
Usually the generated SQL script will be ordered by the type of statements. So first all statements
to drop constraints will be listed, then the drop table statements, then the statements to re-create all foreign keys.
When specifying |
-outputFile |
Defines the file into which all statements are written. If multiple tables are selected
using the |
-outputDir |
Specifies an output directory into which one script per selected table will be written.
The script files are named |
If neither -outputFile
nor -outputDir
is specified, the output
is written to the message panel.
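A sketch of both output variants (table and path names are only examples):

```sql
-- a single script including the CREATE TABLE statements, written to one file
WbGenerateDrop -tables=person -includeCreate=true -outputFile=c:/temp/drop_person.sql;

-- one script per table, written to a directory
WbGenerateDrop -tables=person,address -outputDir=c:/temp/scripts;
```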
The command WbGenerateDelete
can be used to generate a SQL script for one or more rows
that should be deleted, including all rows from referencing tables (if foreign key constraints are defined).
This is also available through the menu item
→ which will generate the delete for the selected row(s) in the current result.
The command supports the following parameters.
Parameter | Description |
---|---|
-table |
Specifies the root table of the hierarchy from which to delete the rows. |
-columnValue |
Defines the expression for each PK column to select the rows to be deleted. The value for this parameter is the column name followed by a colon, followed by the value for this column or an expression.
e.g.:
You can also specify expressions instead:
For a multi-column primary key, specify the parameter multiple times:
|
-includeCommit |
If |
-outputFile |
The file into which the generated statements should be written. If this is omitted, the statements are displayed in the message area. |
-appendFile |
Valid values:
If |
-formatSql |
Valid values:
If |
To generate a script that deletes the person with ID=42 and all rows referencing that person, use the following statement:
WbGenerateDelete -table=person -columnValue="id:42";
To generate a script that deletes any person with an ID greater than 10 and all rows referencing those rows, use the following statement:
WbGenerateDelete -table=person -columnValue="id: > 10";
To generate a script that deletes rows from the film_category
where the primary key consists of
the columns film_id
and category_id
:
WbGenerateDelete -table=film_category -columnValue="film_id: in (1,2,5)" -columnValue="category_id: in (7,3,5)";
WbGenerateScript
re-creates the SQL for objects in the database.
It is the command line version of the Generate Script option in the
DbExplorer
The command supports the following parameters.
Parameter | Description |
---|---|
-objects |
A comma separated list of table (views or other objects), e.g. |
-exclude |
A comma separated list of object names to be excluded from the generated script.
The parameter supports wildcards |
-schemas |
A comma separated list of schemas. If this is not specified then the current (default) schema is used.
If this parameter is provided together with the The parameter supports auto-completion and will show a list of the available schemas. |
-types |
A comma separated list of object types e.g. The parameter supports auto-completion and will show a list of the available object types for the current DBMS. |
-file |
Defines the output file into which all statements are written. If this is not specified, the generated SQL statements are shown in the message area. |
-includeTriggers |
If this parameter is present (or set to true), then all triggers (for the selected schemas) will be retrieved as well.
The default is |
-includeProcedures |
If this parameter is present (or set to true), then all procedures and functions (for the selected schemas) will be retrieved as well.
The default is |
-includeDrop |
If this parameter is present (or set to true) a |
-includeTableGrants |
This parameter controls the generation of table grants. The default value is true .
|
-useSeparator |
If this parameter is present (or set to true), comments will be added that identify the start and end
of each object. The default is |
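For example, to script two schemas into a single file, or to script all tables starting with "ORD" except the log tables (schema and table names are only examples):

```sql
WbGenerateScript -schemas=shop,backoffice -types=TABLE,SEQUENCE -file=c:/temp/schema.sql;

WbGenerateScript -objects=ord% -exclude=ord_log% -includeDrop=true -file=c:/temp/orders.sql;
```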
WbGenerateImpTable
analyzes an import file and generates a suitable CREATE TABLE
statement to create a table with a structure that matches the import file, so that the file can be
imported into that table.
By default this command will only check the first 1000 lines of the input file, assuming that the values are distributed evenly. If the data types for the columns do not reflect the real data, the sample size needs to be increased.
The generated table definition is intended as a quick way to import the data; the column definitions are therefore unlikely to be completely correct or optimal.
The command supports the following parameters.
Parameter | Description |
---|---|
-file |
Specifies the input file to be analyzed. The filename may contain wildcards. When specifying
e.g.: |
-lines |
Defines the number of lines to analyze. The default is 1000 if this parameter is not specified. A value of 0 (zero) or less results in parsing the entire file. |
-type |
Possible values:
The type of the import file. The valid types are the same as for WbImport.
The To import spreadsheet files, the necessary additional libraries must be installed. |
-useVarchar |
Possible values:
If enabled, all columns will be created as
By default |
-delimiter | The delimiter for text files. |
-quoteChar | The quote character for text files. |
-encoding | The encoding for text files. |
-header | Specifies if the input file contains column headers. |
-dateFormat | The format for date columns. |
-timestampFormat | The format for timestamp columns in the input file. |
-decimal | The character used as the decimal separator. |
-outputFile |
By default the generated SQL statement is shown in the message area. If |
-sheetNumber |
If the input file is a spreadsheet, this parameter defines the sheet number to be analyzed.
The first sheet has the number
When specifying |
-table | The table name to use (or create) |
-runScript |
Possible values:
By default, the
By default, this will display a dialog to confirm the execution of the |
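For example, to analyze the first 5000 lines of a text file and create the matching table directly (file and table names are only examples):

```sql
WbGenerateImpTable -file=c:/temp/person.csv
                   -type=text
                   -delimiter=','
                   -header=true
                   -lines=5000
                   -table=person
                   -runScript=true;
```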
Describe shows the definition of the given table. It can be abbreviated with DESC. The command expects the table name as a parameter. The output of the command will be several result tabs to show the table structure, indexes and triggers (if present). If the "described" object is a view, the message tab will additionally contain the view source (if available).
DESC person;
If you want to show the structure of a table from a different user, you need
to prefix the table name with the desired user, e.g.: DESCRIBE otheruser.person;
This command lists all available tables (including views and synonyms). This output is equivalent to the left part of the Database Object Explorer's Table tab.
You can limit the displayed objects by specifying a wildcard for the
names to be retrieved: WbList P%
will list all tables or
views starting with the letter "P".
The command supports two parameters to specify the tables and objects to be listed. If you want to limit the result by specifying a wildcard for the name and the object type, you have to use the parameter switches:
Parameter | Description |
---|---|
-objects |
Select the objects to be returned using a wildcard name, e.g. |
-types |
Limit the result to specific object types, e.g. |
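For example, to list only the views whose names start with "V_":

```sql
WbList -objects=v_% -types=VIEW;
```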
This command will list all indexes defined on tables available to the current user.
The command supports two parameters to specify the tables and objects to be listed. If you want to limit the result by specifying a wildcard for the name and the object type, you have to use the parameter switches:
Parameter | Description |
---|---|
-schema |
Show only indexes for the specified schema, e.g. |
-catalog |
Show only indexes for the specified catalog e.g. |
-tableName |
Show only indexes for the tables specified by the parameter. The parameter value
can contain a wildcard, e.g. |
-indexName |
Show only indexes with the specified name. The parameter value
can contain a wildcard, e.g. |
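For example, to show all indexes on tables starting with "ORD" in the schema public (names are only examples):

```sql
WbListIndexes -schema=public -tableName=ord%;
```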
This command will list all stored procedures available to the current user. The output of this command is equivalent to the Database Explorer's Procedure tab.
You can limit the list by supplying a wildcard search for the name, e.g.:
WbListProcs public.p%
This command will list all stored triggers available to the current user. The output of this command is equivalent to the Database Explorer's Triggers tab (if enabled)
Lists the available catalogs (or databases). It is the same information that is shown in the DbExplorer's "Database" drop down.
The output of this command depends on the underlying JDBC driver and DBMS.
For MS SQL Server this lists the available databases (which then could be changed
with the command USE <dbname>
)
For Oracle this command returns nothing as Oracle does not implement the concept of catalogs.
This command calls the JDBC driver's getCatalogs()
method and will
return its result. If on your database system this command does not display
a list, it is most likely that your DBMS does not support catalogs (e.g. Oracle)
or the driver does not implement this feature.
This command ignores the filter defined for catalogs in the connection profile and always returns all databases.
Lists the available schemas from the current connection. The output of this command depends on the underlying JDBC driver and DBMS. It is the same information that is shown in the DbExplorer's "Schema" drop down.
This command ignores the filter defined for schemas in the connection profile and always returns all schemas.
This command will show the source for a single table. The name of the table is given as an argument to the command:
WbTableSource person
This command will show the source for a single view. The name of the view is given as an argument to the command:
WbViewSource v_current_orders
This command will show the source for a single stored procedure (if the current DBMS is supported by SQL Workbench/J). The name of the procedure is given as an argument to the command:
WbProcSource theAnswer
This command retrieves the row counts for several tables at once. If called without parameters the row counts for all tables accessible to the current user are counted.
The command supports the following parameters to specify the tables (or views) to be counted.
Parameter | Description |
---|---|
-schema |
Count the rows for tables from the given schemas, e.g. The parameter supports auto-completion and will show a list of available schemas. |
-catalog |
Count the rows only for tables from the specified catalog, e.g. |
-objects |
Show only the row counts for the tables (or views) specified by the parameter. The parameter value
can contain wildcards, e.g. The parameter supports auto-completion and will show a list of available tables. |
-types |
Define the types of objects which should be selected. By default only tables are considered.
If you also want to count the rows for views, use The parameter supports auto-completion and will show a list of available object types. |
-orderBy |
Defines how the resulting table should be sorted. By default it will be sorted
alphabetically by table name.
To sort by multiple columns, separate the column names with a comma: |
-excludeColumns |
Possible values:
By default
You can specify a comma separated list of columns to be excluded, e.g.
The name |
If none of the above parameters are used, WbRowCount
assumes that a list of table names was specified.
WbRowCount person,address,orders
is equivalent to WbRowCount -objects=person,address,orders
. When
called without any parameters the row counts for all tables accessible to the current user will be displayed.
Unlike the Count rows item in the DbExplorer, WbRowCount
displays the result for all tables once it is finished. It does not incrementally update the output.
With the WbConnect
command, the connection for the currently running script can be changed.
When this command is run in GUI mode, the connection is only
changed for the remainder of the script execution. Therefore at least one other statement should be
executed together with the WbConnect
command, either by running
the complete script in the editor or by selecting the WbConnect
command
together with other statements. Once the script has finished, the connection is closed
and the "global" connection (selected in the connect dialog) is active again. This also applies
to scripts that are run in batch mode or
scripts that are started from within the console using
WbInclude
.
When this command is entered directly in the command line of the
console mode, the current connection is closed and the
new connection is kept open until the application ends, or a new connection is established
using WbConnect
on the command line again.
There are three different ways to specify a connection:
Parameter | Description |
---|---|
-profile |
Specifies the profile name to connect to.
This parameter is ignored if either |
-profileGroup | Specifies the group in which the profile is stored. This is only required if the profile name is not unique |
Parameter | Description |
---|---|
-connection |
Allows specifying a full connection definition as a single parameter (and thus does not require a pre-defined connection profile). The connection is specified with a comma separated list of key value pairs:
e.g.: If an appropriate driver is already configured the driver's classname or the JAR file don't have to be specified. If an appropriate driver is not configured, the driver's jar file must be specified:
SQL Workbench/J will try to detect the driver's classname automatically (based on the JDBC URL).
If this parameter is specified,
The individual parameters controlling the connection behavior
can be used together with |
Parameter | Description |
---|---|
-url | The JDBC connection URL |
-username | Specify the username for the DBMS |
-password |
Specify the password for the user
If this parameter is not specified (but |
-driver | Specify the full class name of the JDBC driver |
-driverJar | Specify the full pathname to the .jar file containing the JDBC driver |
-autocommit | Set the autocommit property for this connection. You can also
control the autocommit mode from within your script by using the
SET AUTOCOMMIT
command.
|
-rollbackOnDisconnect | If this parameter is set to true, a
ROLLBACK will
be sent to the DBMS before the connection is closed. This setting is
also available in the connection profile.
|
-checkUncommitted | If this parameter is set to true, SQL Workbench/J will try to detect uncommitted changes in the current transaction when the main window (or an editor panel) is closed. If the DBMS does not support this, this argument is ignored. It also has no effect when running in batch or console mode. |
-trimCharData |
Turns on right-trimming of values retrieved from
CHAR
columns. See the
description of the
profile properties for details.
|
-removeComments | This parameter corresponds to the Remove comments setting of the connection profile. |
-fetchSize | This parameter corresponds to the Fetch size setting of the connection profile. |
-ignoreDropError | This parameter corresponds to the Ignore DROP errors setting of the connection profile. |
-altDelimiter | This parameter corresponds to the Alternate delimiter setting of the connection profile. |
If none of the parameters is supplied when running the command, it is assumed that any value
after WbConnect
is the name of a connection profile, e.g.:
WbConnect production
will connect using the profile name production
, and is equivalent to
WbConnect -profile=production
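A connection can also be defined completely on the command line, without a profile, using the individual parameters listed above (URL, user name and driver path are only examples):

```sql
WbConnect -url=jdbc:postgresql://localhost/mydb
          -username=thomas
          -password=secret
          -driverJar=c:/drivers/postgresql.jar;
```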
This command is primarily intended for console mode to show the statements that have been executed. In console mode the number of any of the listed statements can be entered to re-execute that statement from the history directly.
Transforms an XML file via a XSLT stylesheet. This can be used to format XML input files into the correct format for SQL Workbench/J or to transform the output files that are generated by the various SQL Workbench/J commands.
Parameters for the XSLT command:
Parameter | Description |
---|---|
-inputfile | The name of the XML source file. |
-xsltoutput | The name of the generated output file. |
-stylesheet | The name of the XSLT stylesheet to be used. |
-xsltParameter |
A list of parameters (key/value pairs) that should be passed to the XSLT processor. When using e.g. the wbreport2liquibase.xslt
stylesheet, the value of the author attribute can be set using -xsltParameter="authorName=42" . This parameter
can be provided multiple times for multiple parameters, e.g. when using wbreport2pg.xslt : -xsltParameter="makeLowerCase=42" -xsltParameter="useJdbcTypes=true"
|
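For example, to transform a schema report into a Liquibase changelog using the wbreport2liquibase.xslt stylesheet mentioned above (the file names are only examples):

```sql
WbXslt -inputfile=c:/temp/schema_report.xml
       -stylesheet=wbreport2liquibase.xslt
       -xsltoutput=c:/temp/changelog.xml
       -xsltParameter="authorName=thomas";
```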
To run an operating system command use WbSysExec
followed by a valid command for your operating system.
To run the program ls
the following call can be used:
WbSysExec ls
To run Windows commands that are internal to cmd.exe
such as DIR
, you
must call cmd.exe
with the /c
switch to make sure cmd.exe is terminated:
WbSysExec cmd /c dir /n
If you need to specify a working directory for the program, or want to specify the command line arguments individually, a second format is available using the standard SQL Workbench/J parameter handling:
Parameter | Description |
---|---|
-program | The name of the executable program |
-argument | One commandline argument for the program. This parameter can be repeated multiple times. |
-dir | The working directory to be used when calling the external program |
WbSysExec
also supports conditional execution
To run an internal Windows command using the second format, use the following syntax:
WbSysExec -program='cmd.exe' -argument='/c' -argument='dir /n' -dir='c:\temp\'
WbSysOpen
can be used to open a file with the default application of the operating system.
WbExport -file=c:/temp/person.txt -sourceTable=person -type=text -header=true;
WbSysOpen c:/temp/person.txt;
![]() | |
Due to limitations of the Java console mode, neither |
To turn on support for Oracle's DBMS_OUTPUT
package you have to use the
(SQL Workbench/J specific) command ENABLEOUT
. As an alternative you can
also use the SQL*Plus command set serveroutput on
. In contrast to
SQL*Plus, set serveroutput on
must be terminated with a semicolon (or the alternate
delimiter).
After running ENABLEOUT
the DBMS_OUTPUT
package is enabled,
and any message written with dbms_output.put_line()
is displayed in the message
panel after executing a SQL statement. It is equivalent to calling the dbms_output.enable() procedure.
You can control the buffer size of the DBMS_OUTPUT
package by passing the
desired buffer size as a parameter to the ENABLEOUT
command:
ENABLEOUT 32000;
![]() | |
Due to a bug in Oracle's JDBC driver, you cannot retrieve columns with
the |
To disable the DBMS_OUTPUT
package again, use the (SQL Workbench/J specific)
command DISABLEOUT
. This is equivalent to calling
dbms_output.disable()
procedure.
ENABLEOUT
and DISABLEOUT
support an additional parameter quiet
to suppress the feedback message that the support for DBMS_OUTPUT
has been enabled or disabled.
Defines a new macro (or overwrites an existing one). This command is primarily intended for the console mode
Parameter | Description |
---|---|
-name | The name of the new macro |
-group | The name of the macro group in which the new macro should be stored |
-text | The text of the macro |
-file | A file from which to read the macro text. If this parameter is supplied, -text is ignored |
-encoding | The encoding of the input file specified with the -file parameter. |
-expand | If true then the new macro is a macro that is expanded while typing |
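For example, to define a macro that retrieves the row counts for a few frequently used tables (the macro, group and table names are only examples):

```sql
WbDefineMacro -name=counts
              -group=admin
              -text='WbRowCount person,address,orders;';
```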
Display the defined macros. This command is primarily intended for the console mode.
The command WbEcho
can be used to print messages. The following statement:
WbEcho The answer is: 42
will print the text "The answer is: 42" to the message pane in GUI mode, or to the console in batch or console mode.
The following SQL Workbench/J commands support conditional execution based on variables:
Conditional execution is controlled using the following parameters:
Parameter | Description |
---|---|
-ifDefined |
The command is only executed if the variable with the specified name is defined.
|
-ifNotDefined |
The command is only executed if the variable with the specified name is not defined.
|
-ifEquals |
The command is only executed if the specified variable has the specified value
|
-ifNotEquals |
The command is only executed if the specified variable does not have the specified value
|
-ifEmpty |
The command is only executed if the specified variable is defined but has an empty value
|
-ifNotEmpty |
The command is only executed if the specified variable is defined and has a non-empty value
|
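As a sketch, a script could skip an optional cleanup step unless a variable was defined, or call an external program only for a specific environment (the variable, file and program names are only examples; WbSysExec is documented above as supporting conditional execution):

```sql
-- only run the script if the variable "do_cleanup" was defined
WbInclude -file=cleanup.sql -ifDefined=do_cleanup;

-- only call the external program if the variable "env" has the value "prod"
WbSysExec -program='backup.sh' -ifEquals='env=prod';
```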
Not all configuration parameters are available through the Options dialog; those have to be changed manually in the file workbench.settings. Editing that file requires closing the application.
Using WbSetConfig
, configuration properties can be changed permanently without restarting SQL Workbench/J.
Any value that is changed through this command will be saved automatically in workbench.settings
when the application is closed.
If you want to e.g. disable the use of Savepoints in the SQL statements entered interactively, the following command will turn this off for PostgreSQL:
WbSetConfig workbench.db.postgresql.sql.usesavepoint=false
For a list of configuration properties that can be changed, please refer to Advanced configuration options
If you supply only the property key, the current value will be displayed. If no argument is supplied for WbSetConfig
all
properties are displayed. You can also supply a partial property key. WbSetConfig workbench.db.postgresql
will list
all PostgreSQL related properties. You can directly edit the properties in the result set.
The value [dbid]
inside the property name will get replaced with the current
DBID.
The following command changes the property named workbench.db.postgresql.ddlneedscommit
if the current connection is against a PostgreSQL database:
WbSetConfig workbench.db.[dbid].ddlneedscommit=true