DB2PRO System: Subscription-Based DB2 LUW Monitoring Tool

A typical DB2 LUW monitoring tool such as Optim Performance Manager is a licensed product that you purchase and install inside your network, on newly acquired hardware - an additional cost to you.  You will then need staff to configure and maintain the installation.  And most certainly, you are not able to make changes to the tool itself - that is, you cannot create your own charts and graphs the way you can with our Business plans.

Our DB2 Monitoring tool (the DB2PRO System) is designed to take a completely different approach.  We have created a subscription-based DB2 Monitoring Tool that runs in the AWS Cloud - so you won't need to buy hardware or software, or consume any of your internal staff's time.  You simply outsource all of that to us via a risk-free subscription.  In addition, you are able to change and customize the tool to meet specific requirements that we did not anticipate.

Our Plans

Free Plan

This plan is ideal for kicking the tires; it comes with a 100 MB/month upload allowance.

Personal ($39.95/Month) Plan

This plan is ideal for DB2 LUW DBAs who want to monitor a few databases.  You will have an upload quota of 500 MB/month - enough for roughly 5 databases.

Premium ($49.95/Month) Plan

This plan is ideal for DB2 LUW DBAs who want to make sure they have room for more databases: it comes with an allowance of 1 GB/month.

Business in AWS Cloud Plan ($995 Initial Install + $300+/month)

Ideal for businesses that want to completely offload this work to us.  We will configure a dedicated AWS server with enough capacity to support your needs, and then help you set up an automatic data feed to your dedicated server in the cloud.

**We offer a 1-month free trial on this plan - running on a t2.small server**

Business in-house ($7500 Initial Install + $? Support/month)

Ideal for businesses that want the solution built in-house for policy or security reasons.  With this plan, you will need a support contract with DB2PRO if you want us to maintain the System for you.  Alternatively, you can receive training on how the system works and have your internal staff customize and maintain it on an ongoing basis.

Please consult our price page for feature comparisons of all these plans.

How to get started with the Free, Personal or Premium Plans

It's very easy: you register an account on our site and specify whether you would like the Free, Personal or Premium plan.  If you opt for the free tier, you will never receive an invoice from us, but we will stop processing your files once you reach your quota.  Your uploaded files will still be there should you decide to upgrade your plan and have them processed.

If you choose a Personal or Premium plan, you will receive a PayPal invoice at the end of each monthly usage period.

You can upgrade or downgrade your plans with a 1-month notice.

Please note that these three plans only have manual upload capabilities through our website (File Upload page).

How to get started with a Business Plan in AWS Cloud

Step-1: Free Consultation

We start with a conference call.  If you would like to take advantage of the free trial, we will discuss how many databases you should monitor during the free trial phase.  The free trial runs on a t2.small server in AWS (2 GB of memory, 1 CPU), and we provision up to 128 GB of disk space.  At the end of the free trial, if you decide to continue, we will port your installation to a bigger server.

Step-2: New AWS Server and New AWS Private Key

Each business client receives a separate AWS server with its own Amazon-compatible key pair.  We can also configure the AWS server to be accessible only from certain IP addresses for additional security.
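For illustration, on AWS this kind of IP restriction is typically expressed as a security-group rule.  A minimal sketch (the group ID and CIDR range are placeholders, not real values):

```sh
# Hypothetical sketch: allow SSH and HTTPS to the DB2PRO server
# only from one office IP range (group ID and CIDR are placeholders).
aws ec2 authorize-security-group-ingress \
    --group-id sg-0123456789abcdef0 \
    --protocol tcp --port 22 --cidr 203.0.113.0/24
aws ec2 authorize-security-group-ingress \
    --group-id sg-0123456789abcdef0 \
    --protocol tcp --port 443 --cidr 203.0.113.0/24
```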

Step-3: We build a new DB2PRO System for you

Building a DB2PRO system takes several hours on our side, and we include this as part of the free trial.  Depending on resource availability, your server should be up and running within one or two business days.

Step-4: Client Setup

The client piece of the solution runs inside your network and is coded as simple Unix shell scripts.  You can look at the content of the scripts on our download page - and if you have any questions, we are here to explain.  These scripts run on your side to collect stats at regular intervals and send them to your AWS server.

Step-5: Start your engine

You start the data collection/shipping scripts on your side and we start the data processing engine on ours.  We will do some verification to iron out any issues and then let the system run.  At this point, your DB2PRO System is ready; we will provide you with a userid/password and you can access the system at:

https://<yourcompany>.db2pro.com

A few words about our data collection strategy

We offer two types of scripting: pure SSH, and collection via a DB2 connection.  The pure SSH approach requires a private/public key pair between the database servers and the server where you run the scripts.  The DB2 Connection approach requires a DB2 client with catalog entries for all the databases you want to collect stats from.  The following table highlights some of the key features, and the strengths and weaknesses of each approach.

|  | SSH | DB2 Connection |
| --- | --- | --- |
| What is the difficulty to set up? | Moderate | Simple |
| How do the scripts run against the DB2 database? | Locally | Remotely |
| Which databases get included for monitoring? | All active databases on the server | Only the hand-picked ones in DatabaseList.ini |
| Could it be intrusive? | No | Maybe - if the database is down, it will be brought up by the connection |
| Do you need a password file? | No | Yes |

DB2 Connection Approach

Since this is a very simple approach, we have provided the instructions on the download page.
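As a rough sketch of the idea (the credentials, file paths, and the SYSIBMADM view queried here are illustrative placeholders, not our exact scripts):

```sh
#!/bin/sh
# Illustrative DB2 Connection collection loop.  DatabaseList.ini
# holds one cataloged database alias per line; the credentials and
# the query are placeholders.
for DBNAME in $(cat DatabaseList.ini); do
    db2 connect to "$DBNAME" user db2pro using "********"
    # Export a sample of monitoring data to an IXF file.
    db2 "export to /tmp/${DBNAME}_bufferpool.ixf of ixf select * from sysibmadm.bp_hitratio"
    db2 connect reset
done
```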

SSH Approach

Here is the concept behind the SSH Approach; since it's a bit more involved, we encourage you to contact us for the source code and instructions.

The following sections describe how our SSH Approach works.

Data Collection Agents (Client Side)

All the data collection scripts come in controller/agent pairs.  A controller script takes care of self-scheduling and dispatching the agent script, and the agent script is responsible for performing the actual work on the remote server.  The agent script subsequently drops files on the remote servers for the shipping agent to pick up at a later time.  This allows data collection and data shipping to work independently: if one goes down, the other can continue with its work.
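To make the division of labor concrete, here is a minimal sketch of the agent side (the directory layout, database name, and view are our illustration, not the actual scripts):

```sh
#!/bin/sh
# Illustrative agent skeleton: collect locally, then drop the result
# where the shipping agent will find it later (paths are placeholders).
DROPDIR=/var/tmp/db2pro/outbox
STAMP=$(date +%Y%m%d%H%M)
mkdir -p "$DROPDIR"
db2 connect to SAMPLE                # local connection, no password
db2 "export to $DROPDIR/SAMPLE_${STAMP}.ixf of ixf select * from sysibmadm.snapdb"
db2 connect reset
# The shipping agent scans $DROPDIR on its own schedule, so collection
# and shipping can each fail and recover independently.
```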

The key characteristics of the data collection scripts are as follows:

Shell Scripts

Our data collection is done via shell scripts.  We chose shell scripts so that our clients can see exactly what is being collected, and can in fact edit the scripts to turn certain modules on or off.  By default, the scripts collect data from the system catalog and administrative views - but that can be adjusted to collect almost anything.
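A minimal sketch of what such a script body could look like (the module names and the views queried are examples we chose for illustration):

```sh
#!/bin/sh
# Illustrative on/off collection modules; assumes a database
# connection is already established by the surrounding script.
COLLECT_TABLESPACES=yes
COLLECT_LOCKS=no                     # edit to yes to enable this module

if [ "$COLLECT_TABLESPACES" = "yes" ]; then
    db2 "export to /tmp/tbsp.ixf of ixf select * from sysibmadm.tbsp_utilization"
fi
if [ "$COLLECT_LOCKS" = "yes" ]; then
    db2 "export to /tmp/locks.ixf of ixf select * from sysibmadm.snaplock"
fi
```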

Self-Scheduling Controller

As mentioned earlier, our controller scripts are self-scheduling.  You usually end up with independent hourly, daily and weekly controller scripts.  We have opted for this approach to keep things simple and remove any dependencies on external scheduling software.  We will provide you with instructions on how to start these scripts in the background as daemons and how to stop them.
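Conceptually, a self-scheduling controller is just a loop around its own dispatch step.  A minimal sketch (the script names are placeholders):

```sh
#!/bin/sh
# Illustrative hourly controller: it schedules itself, so no cron
# or external scheduler is required.
# Start it in the background as a daemon, e.g.:
#   nohup ./hourly_controller.sh >/dev/null 2>&1 &
while true; do
    ./dispatch_hourly_agents.sh      # placeholder dispatch step
    sleep 3600                       # sleep until the next hour
done
```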

Active Databases ONLY

Our collection agent scripts are designed to only connect to an active database for data collection.  This is by design, as we don't want to activate a database just for the purpose of stats collection.  We strongly recommend keeping this setting, but of course you can change it if you wish.
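One way to picture this check (the output parsing shown assumes the typical db2 CLP layout):

```sh
#!/bin/sh
# Illustrative filter: list only databases that are already active,
# so the collection step never activates anything itself.
ACTIVE=$(db2 list active databases | awk '/Database name/ {print $4}')
for DBNAME in $ACTIVE; do
    db2 connect to "$DBNAME"         # local connection, no password
    # ... collection modules run here ...
    db2 connect reset
done
```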

Password-Less

Our controller scripts use an SSH tunnel to gain access to remote servers for dispatching agents.  The agents then run as local processes against the DB2 database - which allows them to omit providing a userid and password.  This type of scripting eliminates any need to store userids and passwords anywhere - we simply don't need them for the scripts.

Furthermore, our scripts are designed to automatically collect from any active database under a DB2 instance.  This means that if you create a new database and activate it, it will get picked up by the db2pro scripts.
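Roughly, the pattern is key-based SSH into the server plus a local db2 connection at the other end (the key path, host, and query are placeholders):

```sh
# Illustrative password-less dispatch: the SSH key authenticates the
# hop, and the agent's local db2 connection needs no USER/USING clause.
ssh -i ~/.ssh/db2pro_key db2inst1@dbserver01 \
    'db2 connect to SAMPLE && db2 "select count(*) from syscat.tables" && db2 connect reset'
```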

Centralized

Our scripts only need to be installed on one server.  The controller script ships the agent script to the remote server first and then executes it.  This means that the same copy of the agent script is executed everywhere.  It also means that if there are exceptional situations out there, we code them into the single agent script instead of keeping a different copy of the agent script on different servers.
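In practice that can be as simple as an scp followed by an ssh (the host list and paths here are placeholders):

```sh
#!/bin/sh
# Illustrative push-then-run: the single master copy of the agent is
# copied out and executed, so every server runs identical code.
for HOST in $(cat ServerList.ini); do
    scp hourly_agent.sh "$HOST:/tmp/hourly_agent.sh"
    ssh "$HOST" "sh /tmp/hourly_agent.sh"
done
```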

IXF

The collection agents create their files in IXF format (a DB2 export format), and the file names are constructed so that the processing agent can identify the origin of each file.  Subsequently, the agent gathers all the files it created during its execution and creates one compressed tarball package, which is shipped to the processing side at a later stage (see Data Shipping Agents).
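A sketch of the idea (the naming scheme shown is our illustration, not the exact convention):

```sh
#!/bin/sh
# Illustrative naming and packaging step, run at the end of an
# agent's collection pass; assumes an active db2 connection.
HOST=$(hostname)
STAMP=$(date +%Y%m%d%H%M)
cd /tmp
# File name encodes origin: host, database, timestamp, module.
db2 "export to ${HOST}_SAMPLE_${STAMP}_bufferpool.ixf of ixf select * from sysibmadm.bp_hitratio"
# Bundle this run's IXF files into one compressed tarball for shipping.
tar czf "${HOST}_${STAMP}.tar.gz" ${HOST}_*_${STAMP}_*.ixf
```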

Each client is unique

We don't collect everything by default - that would certainly overload your systems.  We always start by collecting the very basics and work our way up.  For us, each client has unique needs; that's why the data collection scripts for each client end up being unique to their needs.  Fortunately, our data processing engine is designed to handle everything you turn on in the data collection scripts.

Data Shipping Agents (Client Side)

The shipping scripts are also self-scheduling and come in controller/agent pairs.  In general, we have two types of data shipping agents:

Agent responsible for bringing the files to the centralized server

This agent is executed on the remote servers via the pre-established SSH tunnel.  During its remote execution, it figures out which compressed tarball files need to be shipped to the centralized server and ships them over the SSH tunnel.  Once the files are on the centralized server, it removes them from the remote server and ends its remote execution.
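A minimal sketch of that remote step (paths and host are placeholders):

```sh
#!/bin/sh
# Illustrative remote shipping step: push finished tarballs to the
# centralized server, then remove the local copies.
CENTRAL=central01:/var/tmp/db2pro/inbox      # placeholder destination
for F in /var/tmp/db2pro/outbox/*.tar.gz; do
    [ -e "$F" ] || continue                  # nothing to ship this round
    scp "$F" "$CENTRAL" && rm "$F"           # delete only after a good copy
done
```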

Agent responsible for shipping the files to the cloud

This agent runs on the centralized server and is responsible for shipping the compressed tarball files to the cloud server.  Once the files are sent, it deletes them from the centralized server.
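And a matching sketch for the cloud leg (the key, hostname, and directories are placeholders):

```sh
#!/bin/sh
# Illustrative cloud-shipping step, run on the centralized server.
for F in /var/tmp/db2pro/inbox/*.tar.gz; do
    [ -e "$F" ] || continue
    scp -i ~/.ssh/db2pro_key "$F" "db2pro@yourcompany.db2pro.com:/incoming/" \
        && rm "$F"                           # remove after successful transfer
done
```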

To some clients this is very straightforward; to others it's confusing.  Either way, we will be more than happy to discuss the db2pro system during your free consultation.

DB2PRO System - Server Side Processing

The server-side processing takes place on your AWS server.  We ensure that:

  1. The vital signs for your DB2PRO System are all good
  2. Files are flowing in properly and are being processed
  3. Your DB2PRO website is kept up to date

Contact us today