Ajilius includes a command line scheduler that makes it easy to integrate complex ELT schedules into your operational environment (e.g. cron, Control-M, Windows Scheduler), as well as into automated test frameworks and CI/CD tools like GoCD.

Here is a sample command line:
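
(The exact Java launch syntax and jar name will depend on your installation; ajilius-scheduler.jar and MyWarehouse below are placeholder names.)

java -jar ajilius-scheduler.jar -w MyWarehouse -f all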

The Java launch at the start of the command line must always be present, as the scheduler runs as a Java program.

The -w parameter tells the scheduler which data warehouse you wish to use. This is the same name as shown on your Data Warehouses list screen.

The remaining parameters comprise one or more parameterised commands. This example tells the scheduler to perform an end-to-end process of all facts.

Because Ajilius is a dependency-based scheduler, this command would trigger an end-to-end rebuild of the data warehouse, loading and processing all dependencies necessary to update the fact tables.

Command Line Parameters

Each parameter has a short and a long name. Both short and long names are case sensitive, to allow for future expansion of the parameter list.

Parameters may be combined in any sequence, except that the -w parameter must always be present, and must always come first.
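
For example, assuming a warehouse named MyWarehouse (a placeholder name), the following two command lines are equivalent, using the short and long parameter names respectively:

-w MyWarehouse -f all
-warehouse MyWarehouse -fact all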

Because Ajilius is dependency based, keep in mind that the effect of a parameter may be overridden by earlier parameters in the list. For example, a command like:

-f all -l all

would be largely redundant, as all tables covered by the load parameter (-l) may already have been processed as dependencies of the facts (-f). We say “may” because there may be load tables which have not yet been associated with facts or dimensions, and these would not be picked up by the fact parameter (-f). The load parameter (-l) would then load just these orphan tables.

The parameters which are recognised by the Ajilius command line scheduler are:

-w -warehouse

Identify the data warehouse against which the command line will be applied. The name used must exist as a Warehouse Name in your metadata database. Only one data warehouse may be updated by one command line.

This is a mandatory parameter, and must be the first parameter on the command line.
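
For example, to apply a dimension and fact refresh to a warehouse named ChinookDW (an illustrative name), -w leads the command line:

-w ChinookDW -d dim_calendar -f fact_sale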

-n -notify

(This feature is yet to be delivered)

Ajilius can send notification emails on completion of a job.

Emails are sent using the SMTP server and credentials maintained in your ajilius.ini configuration file.

The options that are supported for the -notify parameter are:

all

This option will cause emails to be sent on all job completions, whether success or error. Without this option, Ajilius defaults to sending emails on error only.

email@address

One or more email addresses may be listed, to which notification emails will be sent when required.

The following example -notify clause will send an email to a specific email address if an error occurs during batch processing:

-n dwsupport@mycompany.com

This example will send notification emails to two separate addresses, on both job success and failure:

-n all dwsupport@mycompany.com dwdev1@mycompany.com

-b -batch

Being a dependency-based scheduler, Ajilius keeps track of the jobs that have been run within a batch. This means that you can run multiple, independent commands to simultaneously load your data warehouse, with Ajilius tracking the dependencies between jobs. Also, in the case of error, Ajilius batches can be restarted without having to re-process earlier work.
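
For example (with placeholder warehouse and source names), the following independent commands might each be triggered as separate jobs within the same batch, with Ajilius ensuring that work completed by one command is not repeated by the others:

-w MyWarehouse -s ChinookOLTP
-w MyWarehouse -s ChinookCRM
-w MyWarehouse -f all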

If this parameter is specified, the only value currently supported is ‘reset’:

-b reset

This tells Ajilius to erase previous batch results, and start a new batch.
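
For example, a nightly job might start a fresh batch before rebuilding all facts (MyWarehouse is a placeholder name); -b reset is placed before the processing parameters so that they run in the new batch:

-w MyWarehouse -b reset -f all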

-l -load

This parameter supports extracting and loading one or more tables.

To extract all tables, use the value “all”:

-l all

To extract and load specific tables, list their name/s as the parameter values:

-l load_album load_artist

-s -source

You can load all tables from a nominated data source using this parameter.

This is useful for systems which generate large data sets that are loaded during an enterprise batch process. For example, as soon as the end-of-day processing has been completed for a system, any tables sourced from that system can be loaded.

To extract all sources, use the value “all”:

-s all

This is the same as the “-l all” load, except that tables will be loaded in order of source name followed by table name.

To load from a specific source, use the extract data source name/s as the parameter value:

-s ChinookOLTP

All tables from the nominated source/s will be loaded in alphabetic sequence of table name.
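
Following the end-of-day scenario above, a job triggered when the ChinookOLTP end-of-day run completes might load that source and then bring the dependent facts up to date (MyWarehouse is a placeholder name):

-w MyWarehouse -s ChinookOLTP -f all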

-t -transform

Transform one or more tables, and their dependencies where necessary.

The transform parameter will update the nominated stage table/s. If their precedent tables (stage and/or load tables) have not been processed, they will be automatically updated as part of this command.

To process all stage tables, use the “all” value:

-t all

To process one or more specific stage tables, list their names as the parameter values:

-t stage_sale stage_track

Because Ajilius is batch aware (see the -b parameter, above), precedent tables that have already been processed in the current batch will not be reprocessed by this command.

-d -dimension

Process one or more dimensions, and their dependencies where necessary.

The dimension parameter will update the nominated dimension table/s. If their precedent stage tables and other dependencies have not been processed, they will be automatically updated as part of this command.

To process all dimension tables, use the “all” value:

-d all

To process one or more specific dimension tables, list their names as the parameter values:

-d dim_calendar dim_location

Because Ajilius is batch aware (see the -b parameter, above), precedent tables that have already been processed in the current batch will not be reprocessed by this command.

-f -fact

Process one or more facts, and their dependencies where necessary.

The fact parameter will update the nominated fact table/s. If their precedent dimensions, stage tables and other dependencies have not been processed, they will be automatically updated as part of this command.

To process all fact tables, use the “all” value:

-f all

This is the simplest command to trigger reprocessing of an entire data warehouse.

To process one or more specific fact tables, list their names as the parameter values:

-f fact_sale fact_sale2

Because Ajilius is batch aware (see the -b parameter, above), precedent tables that have already been processed in the current batch will not be reprocessed by this command.

-x -export

Export one or more tables to CSV files.

This command is very useful during the development and testing phase of the data warehouse project, to create extracts for external validation.

It may also be used to create regular exports for offline analytics.

To export one or more tables, list their names as parameter values:

-x fact_sale load_transaction

Note that the “all” parameter value is not supported for exports.

These tables will be exported as comma-separated files, with a header line containing the column names. The CSV files will be placed in the Extract Directory specified on the Change Warehouse page. The file names will be of the form table_name.csv.
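
For example, the following command line (MyWarehouse is a placeholder name) would write fact_sale.csv and load_transaction.csv to the warehouse's Extract Directory:

-w MyWarehouse -x fact_sale load_transaction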