Running the CommVault External Data Connector Scripts for Symantec NetBackup 7.5

I’m starting a new project to change the backup infrastructure for one of my customers. The customer currently uses NetBackup 7.5 (installed on Windows), and we will incorporate the remote site into its centrally managed CommVault 10 environment to ease the handover to the operational teams and to use the capacity-based licenses efficiently.

To get a good understanding of the environment, including data sizing, we opted to run the External Data Connector (EDC) scripts for NetBackup 7.5 provided by CommVault Cloud Services.

Please note that you can install an “External Data Connector Agent” on one of your clients and use it to feed the data from the NetBackup environment. However, I’m not in favour of this approach, as it pollutes the CommServe database. This blog post describes how you can use the CommVault Cloud Applications to stage the existing configuration without touching the production environment.

1/ First of all, collect the Symantec NetBackup version by opening the “NetBackup Administration Console” and navigating to “Help > About NetBackup Administration Console“.


2/ Log on to CommVault Cloud and choose “Cloud Applications > External Data Connector“. Please note that you can create a temporary account in case you want the customer to execute these steps themselves.

3/ Verify that the NetBackup version you are running is supported by the CommVault EDC scripts. Upon confirmation, click “Download“.

4/ Before guiding you through the wizard, let’s discuss the configuration parameters required for the External Data Connector script. The wizard will ask for the following:

The installed NetBackup version and the installation directory of the NetBackup software, both of which can be found by querying the registry.

The NetBackup database (NBDB) server name. NetBackup installs Sybase SQL Anywhere during the installation of the master server. The NetBackup database (NBDB) is also known as the Enterprise Media Manager (EMM) database; it contains information about volumes, and about the robots and drives in NetBackup storage units. The same Sybase SQL Anywhere installation is used for the optionally licensed Bare Metal Restore database (BMRDB), which contains the information managed by the NetBackup Bare Metal Restore option and is created during the BMR installation. The NetBackup Authorization database (NBAZDB) is used to manage access control to the NetBackup software.

To collect the database server name, open the “server.conf” file at the following location: “D:\Program Files\Veritas\NetBackupDB\CONF“.

Additionally, collect the server port number used for the NetBackup database (NBDB); “13785” should be the default value.
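If you want to automate this lookup, a minimal sketch of pulling the server name and port out of the file’s option string is shown below. Note the assumptions: the `-n`/`ServerPort=` option layout is the typical SQL Anywhere style, and the sample string (including the server name) is purely illustrative — check your own server.conf for the real values.

```python
import re

def parse_server_conf(text):
    """Extract the database server name (-n <name>) and the TCP/IP
    ServerPort from a SQL Anywhere server.conf-style option string."""
    name_match = re.search(r"-n\s+(\S+)", text)
    port_match = re.search(r"ServerPort=(\d+)", text)
    name = name_match.group(1) if name_match else None
    # Fall back to the documented default port when none is listed
    port = int(port_match.group(1)) if port_match else 13785
    return name, port

# Illustrative content; read the real file from ...\Veritas\NetBackupDB\CONF
sample = '-n VERITAS_NB_masterserver -x "tcpip(LocalOnly=YES;ServerPort=13785)"'
print(parse_server_conf(sample))  # ('VERITAS_NB_masterserver', 13785)
```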

Update: I found out you can also collect the server name from the registry (this is not by the book ;) ). Don’t forget to drop the “Veritas_” prefix!
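As a trivial illustration of that last point (the example value is hypothetical, and I strip the prefix case-insensitively to be safe):

```python
def nbdb_server_name(registry_value: str) -> str:
    """Drop a leading "Veritas_" prefix (any casing) from the
    registry value to obtain the NBDB server name."""
    prefix = "veritas_"
    if registry_value.lower().startswith(prefix):
        return registry_value[len(prefix):]
    return registry_value

print(nbdb_server_name("VERITAS_NB_masterserver"))  # NB_masterserver
```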

5/ The NetBackup master server is running on Windows. Unzip the package containing the scripts, right-click “edc_nbu.bat” and choose “Run as administrator“.

7/ Okay, let’s get to work and fill in the parameters collected in step 4. Once you have initiated the data collection, the command prompt will close automatically. In my case, it took about 5 minutes to collect the data. Once the collection completes, compress the output into a zip file and copy it back to your workstation.
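If you prefer scripting the packaging step, here is a small sketch using Python’s standard library; the directory and archive names are placeholders standing in for wherever the EDC output landed on your master server.

```python
import os
import shutil
import tempfile

def package_edc_output(collected_dir: str, archive_base: str) -> str:
    """Compress the EDC output directory into <archive_base>.zip and
    return the full path of the created archive."""
    return shutil.make_archive(archive_base, "zip", collected_dir)

# Demo with a throwaway directory standing in for the EDC output folder
demo_dir = tempfile.mkdtemp()
with open(os.path.join(demo_dir, "edc_nbu.log"), "w") as f:
    f.write("collected data\n")

# Write the archive outside the folder being zipped
archive = package_edc_output(demo_dir,
                             os.path.join(tempfile.mkdtemp(), "edc_output"))
print(archive)
```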

8/ If you did everything properly, the data collection should have completed successfully. Open the CommVault Cloud website again and upload the zip file.


If you want to follow up on the request, open the CommVault Cloud website and navigate to “Cloud Applications > My Requests“.

9/ A couple of hours later, I received an email including three reports: “SystemArchitectureRecommendation” (HTML), “JobSummaryReport” (PDF) and “LicenseSummaryReport” (PDF).




The CommVault External Data Connector tool gives a good overview of the as-is environment. However, the data sizes need to be interpreted with caution! It provides a breakdown of the systems and their respective data sizes.

Graphs illustrate the job statistics and give an overview of the back-end data capacity, which is useful for a first impression of the operational status.

The configuration (and backup history) included a bunch of old, decommissioned systems (for example the old NetApp storage). These gave a wrong impression of the total data size, as they were added to the front-end capacity even though their last full backup dated from 7 months ago. After filtering these out, the current sizing requires 40 TB less licensed capacity.

The “SystemArchitectureRecommendation” report already provides a sizing for both the CommServe and the MediaAgent (including deduplication). Additionally, a proposal for capacity-based licenses is made. However, I’m not entirely sure it’s the best fit from a financial point of view.

I can certainly advise using the tool in your design and/or sizing process. However, keep in mind that validation and some manual labour are still required!

Thanks for reading!
