Essential software for data transformation, management, and compliance.

HPE Shadowbase Essentials Bundle

A Diverse Set of Data Management Utilities

HPE Shadowbase Essentials is a software package that includes several solutions for data transformation, management, compliance, monitoring, and control.

This package is sold separately from the main change data capture (CDC) replication engine.


HPE Shadowbase Audit Compliance Utilities

Know and Record Your Transactions

HPE Shadowbase Audit Log (SAL)[1]: Create and fill an archival, searchable database with the data changes made to the application database.

Shadowbase Audit Log is available for the HPE NonStop Server platform as a source, and creates a searchable archival database of transactional activity (e.g., inserts, updates, and deletes) on a reporting database for application change data auditing purposes.

HPE Shadowbase Audit Reader (SAR)[1]: Show what application transactions occurred, and when.

Shadowbase Audit Reader analyzes and displays all TMF-audited database activities contained in the TMF audit trails on HPE NonStop server systems. Using an SQL-like query syntax, it shows what application transactions did, and when they did it, to database files and tables, enabling investigation of how and when data (and the database) has been changed.
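The following Python sketch only illustrates the kind of "what changed, and when" question these utilities answer; the record layout and filter are hypothetical and do not represent SAR's actual SQL-like query syntax or SAL's archival schema.

```python
# Conceptual sketch only: investigating how and when a table was changed.
# The record layout and filter below are hypothetical illustrations.
from datetime import datetime

audit_records = [
    {"ts": datetime(2024, 5, 1, 9, 15), "op": "UPDATE", "table": "ACCOUNTS", "key": 7},
    {"ts": datetime(2024, 5, 1, 9, 20), "op": "DELETE", "table": "ORDERS",   "key": 3},
    {"ts": datetime(2024, 5, 1, 9, 25), "op": "INSERT", "table": "ACCOUNTS", "key": 9},
]

# What changed in ACCOUNTS between 09:00 and 09:30, and what kind of change was it?
start, end = datetime(2024, 5, 1, 9, 0), datetime(2024, 5, 1, 9, 30)
for rec in audit_records:
    if rec["table"] == "ACCOUNTS" and start <= rec["ts"] <= end:
        print(rec)
```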

"That SBDDLUTL tool was a Godsend, very useful."

-Large Travel Company Technologist


HPE Shadowbase Data Definition Language

Convert and Map Enscribe Data Structures into SQL Equivalents (SBDDLUTL)[1]

The HPE Shadowbase Data Definition Language Utility (SBDDLUTL) provides a powerful interface to convert and map the Enscribe DDL data structures (records, fields) into their SQL equivalents (tables, columns), producing an editable CREATE TABLE statement for the target SQL environment.

SBDDLUTL supports a variety of target SQL databases, including NonStop SQL/MP, NonStop SQL/MX, Oracle, Microsoft SQL Server, IBM Db2®, Oracle MySQL, SAP HANA, and SAP Sybase. It is a key utility for customers performing Shadowbase Data Integration replication from NonStop source Enscribe files out to SQL target tables.

Without SBDDLUTL, the conversion work would have to be done by hand, which is a time-consuming and error-prone process.
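As a rough illustration of the record-to-table mapping involved, the following Python sketch converts a hypothetical Enscribe DDL record into a CREATE TABLE statement. The field names and the type map are invented for this example and the actual output and supported mappings of SBDDLUTL differ by target database.

```python
# Illustrative only: a simplified sketch of the Enscribe-record-to-SQL-table
# mapping concept that SBDDLUTL automates. Field names and the type map are
# hypothetical and do not represent SBDDLUTL's actual output.

# A hypothetical Enscribe DDL record described as (field name, DDL picture).
enscribe_record = ("CUSTOMER", [
    ("CUST-ID",   "PIC 9(10)"),    # numeric display field
    ("CUST-NAME", "PIC X(40)"),    # alphanumeric field
    ("BALANCE",   "PIC 9(7)V99"),  # implied-decimal numeric
])

def to_sql_type(ddl_type: str) -> str:
    """Very rough type map for one hypothetical SQL target; real mappings vary."""
    if ddl_type.startswith("PIC X"):
        length = ddl_type[ddl_type.index("(") + 1:ddl_type.index(")")]
        return f"VARCHAR({length})"
    if "V" in ddl_type:
        return "DECIMAL(9,2)"
    return "NUMERIC(10)"

def create_table_stmt(record) -> str:
    name, fields = record
    cols = ",\n  ".join(f"{f.replace('-', '_')} {to_sql_type(t)}" for f, t in fields)
    return f"CREATE TABLE {name} (\n  {cols}\n);"

print(create_table_stmt(enscribe_record))
```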

Convert and Map SQL/MP Schemas into Target SQL Equivalents (SBCREATP)[1]

The HPE Shadowbase SQL/MP Schema Conversion Utility (SBCREATP) converts and maps SQL/MP table schema data structure definitions (columns, data types) into target SQL equivalents (columns, data types), and produces an editable CREATE TABLE statement for the target SQL environment.

SBCREATP supports a variety of target SQL databases, including NonStop SQL/MX, Oracle, Microsoft SQL Server, IBM Db2®, Oracle MySQL, PostgreSQL, SAP HANA, and SAP Sybase. It is a key utility for customers performing Shadowbase data integration replication from NonStop source SQL/MP tables out to heterogeneous SQL target tables.

Without SBCREATP, the conversion work would have to be done by hand, which is a time-consuming and error-prone process.
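The data type translation step can be pictured with the short Python sketch below, which maps a few SQL/MP column types to a PostgreSQL target. The column names and the mapping table are hypothetical; SBCREATP's actual, per-target mappings are far more complete.

```python
# Illustrative only: the kind of source-to-target data type translation that
# SBCREATP automates for SQL/MP schemas. This subset is hypothetical.
SQLMP_TO_POSTGRES = {
    "LARGEINT":                  "BIGINT",
    "NUMERIC(18,2)":             "NUMERIC(18,2)",
    "DATETIME YEAR TO FRACTION": "TIMESTAMP",
}

source_columns = [("ORDER_ID", "LARGEINT"),
                  ("ORDER_TS", "DATETIME YEAR TO FRACTION"),
                  ("AMOUNT",   "NUMERIC(18,2)")]

cols = ",\n  ".join(f"{name} {SQLMP_TO_POSTGRES[sqlmp_type]}"
                    for name, sqlmp_type in source_columns)
print(f"CREATE TABLE ORDERS (\n  {cols}\n);")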


HPE Shadowbase Map

Flexibly Transform Data

  • Create a data mapping and transformation stream between a source and target database and/or application
  • HPE SBMAP[1] lets users define and apply data mapping during replication without writing C, C++, or COBOL code to implement custom Shadowbase User Exits (see the conceptual sketch below)
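The Python sketch below illustrates the general idea of mapping defined as configuration rather than as compiled user exit code. The rule format and transformations are hypothetical and do not represent SBMAP's actual syntax or capabilities.

```python
# Conceptual sketch only: declarative column mapping/transformation rules applied
# to a replicated row. The rule format here is hypothetical, not SBMAP syntax.
mapping_rules = {
    "CUST_NO":   ("customer_id",  int),             # rename + type change
    "CUST_NAME": ("customer_name", str.strip),      # rename + trim whitespace
    "STATUS":    ("status", lambda s: s.upper()),   # normalize values
}

def apply_mapping(source_row: dict) -> dict:
    return {target: transform(source_row[source])
            for source, (target, transform) in mapping_rules.items()}

print(apply_mapping({"CUST_NO": "1001", "CUST_NAME": " Ada ", "STATUS": "active"}))
```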

HPE Shadowbase Data Recovery Utilities

Restore Corrupted Databases

HPE Shadowbase UNDO[1]: Selectively undo, or roll back, changes made to a database

Shadowbase UNDO restores a corrupted database by undoing corrupting database changes while retaining correct database changes, thus leaving the database in a known, consistent, and current state. Shadowbase UNDO allows the database to stay online for application processing while the rollback of the corrupted data occurs.
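The selective-undo idea can be sketched as follows, assuming hypothetical change records: each corrupting change is reversed by applying its inverse, newest first, while all other changes are left untouched. This is a conceptual illustration only, not Shadowbase UNDO's implementation.

```python
# Conceptual sketch only (not Shadowbase UNDO's implementation): selective undo
# applies the inverse of each corrupting change, in reverse order.
def inverse(change):
    op, table, before, after = change
    if op == "INSERT":
        return ("DELETE", table, after, None)    # undo an insert by deleting the row
    if op == "DELETE":
        return ("INSERT", table, None, before)   # undo a delete by re-inserting the row
    return ("UPDATE", table, after, before)      # undo an update by restoring the before-image

corrupting = [("UPDATE", "ACCOUNTS", {"id": 7, "bal": 100}, {"id": 7, "bal": 0}),
              ("INSERT", "ACCOUNTS", None, {"id": 8, "bal": -50})]

for change in reversed(corrupting):              # newest change is undone first
    print(inverse(change))
```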

HPE Shadowbase REDO[1]: Roll database changes forward to a current and correct state

Shadowbase REDO maintains a Redo Queue of all changes that were made to a database. If the database subsequently becomes corrupted, Shadowbase REDO can roll these changes forward against a restored copy of the database: corrupting changes are deleted from the Redo Queue, and only the valid changes are applied, bringing the database to a current and correct state.
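A minimal Python sketch of the roll-forward idea, with a hypothetical queue layout (not Shadowbase REDO's implementation): the queue is filtered to drop corrupting changes and the remaining valid changes are replayed in order against the restored database.

```python
# Conceptual sketch only (not Shadowbase REDO's implementation): filter the redo
# queue, then replay the valid changes in sequence against the restored database.
redo_queue = [
    {"seq": 1, "op": "INSERT", "table": "ORDERS", "corrupting": False},
    {"seq": 2, "op": "UPDATE", "table": "ORDERS", "corrupting": True},   # dropped
    {"seq": 3, "op": "UPDATE", "table": "ORDERS", "corrupting": False},
]

for change in (c for c in redo_queue if not c["corrupting"]):
    print(f"replaying change {change['seq']}: {change['op']} on {change['table']}")
```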


HPE Shadowbase DDL Command Replication (DCR)

Automatically Replicate and Apply DDL Commands

Shadowbase DCR[1] automatically replicates and applies SQL/MP SQLCI (DDL) source commands to the target database, adjusting each command to match the target environment's details (e.g., file/table name mapping).

Replicated SQLCI DDL commands integrate with HPE Shadowbase (DML) real-time replication. DCR is particularly helpful to customers migrating from RDF/SDR to HPE Shadowbase.
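The name-mapping adjustment can be pictured with this Python sketch, using an invented source DDL command and name map; it is a conceptual illustration of rewriting source names to target names, not DCR's implementation.

```python
# Conceptual sketch only (not DCR's implementation): a replicated source DDL
# command has its source file/table names rewritten to the target environment's
# names before it is applied. The name map and command text are hypothetical.
NAME_MAP = {"\\PROD.$DATA1.SALES.ORDERS": "\\DR.$DATA2.SALES.ORDERS"}

source_ddl = 'ALTER TABLE \\PROD.$DATA1.SALES.ORDERS ADD COLUMN REGION CHAR(2);'

target_ddl = source_ddl
for src_name, tgt_name in NAME_MAP.items():
    target_ddl = target_ddl.replace(src_name, tgt_name)

print(target_ddl)
```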


HPE Shadowbase Extract, Transform, and Load (ETL)

Extract Database Changes or Initial Load Data into Flat Files

The HPE Shadowbase Extract, Transform, and Load Toolkit (ETL)[1] extracts database changes or initial load data from a Shadowbase source database or environment into flat files for subsequent ETL (or Extract, Load, and Transform, ELT) loading into a target environment such as a data warehouse. HPE Shadowbase Application Integration provides a suite of adapters to easily integrate the source data with the target environment. Available adapters include:

  • Integrating with Kafka
  • Integrating with IBM MQ environments
  • JSON output format
  • Flat files (CSV, fixed position, tab-delimited)

For certain applications, an existing vendor’s ETL (or ELT) loading utility can then be used to load the Shadowbase ETL flat file(s) into the target environment.
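To make the output formats concrete, the Python sketch below writes a couple of hypothetical change records as a CSV flat file and as JSON documents. The field names and layouts are invented for this example; the actual Shadowbase ETL Toolkit output layouts depend on configuration.

```python
# Illustrative only: the general shape of change data written to flat-file (CSV)
# and JSON outputs for downstream ETL/ELT loading. Field names are hypothetical.
import csv, json, sys

changes = [
    {"op": "INSERT", "table": "ORDERS", "order_id": 1001, "amount": "25.00"},
    {"op": "UPDATE", "table": "ORDERS", "order_id": 1001, "amount": "27.50"},
]

# CSV flat file (comma-delimited); a tab-delimited file would use delimiter="\t".
writer = csv.DictWriter(sys.stdout, fieldnames=["op", "table", "order_id", "amount"])
writer.writeheader()
writer.writerows(changes)

# JSON output, one document per change event.
for change in changes:
    print(json.dumps(change))
```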

[1] Prerequisite Solutions

To use this solution, HPE Shadowbase BC Basic or HPE Shadowbase DIAI Basic must be licensed for the system.



Utilities

HPE Shadowbase Utilities are separate from the Essentials Bundle and include additional tools for various uses, including managing, monitoring, and correcting data.