To access on-premise data sources, the SAP BTP cloud connector is required for all data connections; for some data sources, the SAP Analytics Cloud agent is also required. The SAP BTP cloud connector is a web server that includes standard connections to applications such as SAP Business Planning and Consolidation standard, SAP Integrated Business Planning, and SAP S/4HANA.
If the SAP Analytics Cloud agent is deployed alongside the SAP BTP cloud connector, connectors to additional data sources become available, such as SAP Business Warehouse, SAP HANA, and SAP ERP.
Using a standard connector means that the administrator for the on-premise data source creates the connection and provides you, the SAP Analytics Cloud administrator, with the required information to create the import data connection in SAP Analytics Cloud.
In this unit, we will cover two examples where additional configuration is required: SQL databases and file servers. For system-specific information on creating import data connections in SAP Analytics Cloud (covering both cloud applications and on-premise data sources), go to: Creating Import Data Connections | SAP Help Portal.
Scenario: SQL Database
Let's explore the process for creating import data connections to SQL databases, which requires some additional configuration steps.
Creating import data connections to SQL databases can be broken down into three key steps:
- Install and configure SAP BTP cloud connector and SAP Analytics Cloud agent.
- Install and configure JDBC drivers for the appropriate SQL database.
- Create the import data connection in SAP Analytics Cloud.
As we covered the installation of the SAP BTP cloud connector and SAP Analytics Cloud agent earlier in the unit, we will start with step 2 of the process.
Prerequisites
A SQL Server authenticated user must be created; SAP Analytics Cloud uses it when creating the import data connection.
Users must have Read or Maintain privileges on the Connection permission in SAP Analytics Cloud in order to view models and stories created from this connection.

Install and Configure JDBC Drivers for the SQL Database
This step is completed by the application administrator. Data is imported from SQL databases using JDBC connectivity.
The SAP Analytics Cloud Agent Simple Deployment Kit automates some of the configuration settings that must otherwise be done manually in a custom installation of the SAP BTP cloud connector and SAP Analytics Cloud agent. The settings for SQL database connections are one example of the settings preconfigured by the kit: the installation automatically creates the environment variable pointing to a sample .properties file that can be modified. The JDBC drivers must still be installed by following the instructions in the Post-Setup Guide included in the kit.
The properties file specifies the database name (see the list below) and the path to the JDBC driver for that database. The environment variable must contain the complete path up to and including the properties file. Example path: C:\Program Files\SAP\SACAgentKit\config\c4a_agent_drivers.properties
Let's take a closer look at how the SQL database administrator configures the JDBC drivers for the SQL database.
The instructions below apply to a manual setup of the SAP Analytics Cloud agent.
- If a manual installation was performed, create a properties file and point to it using either:
- Java option: -DSAP_CLOUD_AGENT_PROPERTIES_PATH
- An environment variable: SAP_CLOUD_AGENT_PROPERTIES_PATH
- Download the JDBC drivers for the required SQL databases and place the downloaded .jar file(s) in the required folder. For example: C:\Program Files\SAP\SACAgentKit\drivers\sybase.
- Modify the properties file by uncommenting (removing the #) the lines for the databases that are required for the connection and entering the path to the driver as the property value. The properties file must include the correct file location of the .jar file. For example, #Sybase SQL Anywhere 16="path_to_JDBCdriver" becomes Sybase SQL Anywhere 16=C:\Program Files\SAP\SACAgentKit\drivers\sybase\jconn.jar.
If the driver requires more than one .jar file, then the paths can be separated by a semicolon.
- Restart the SAP Analytics Cloud agent, using either the Java option or the environment variable to specify the complete path up to and including the properties file.
- Java option: Restart the agent via the command line by navigating to the tomcat/bin directory and doing the following:
- Run the shutdown.bat or shutdown.sh script.
- Open the catalina.bat or catalina.sh file.
- Find the line where Java options are set. It should look similar to this: set "JAVA_OPTS=%JAVA_OPTS% %LOGGING_CONFIG% -Xms1024m -Xmx10246m -XX:NewSize=256m -XX:MaxNewSize=356m -XX:PermSize=256m -XX:MaxPermSize=4096m"
- Modify this line so that the -DSAP_CLOUD_AGENT_PROPERTIES_PATH option is included and points to your properties file. For example:
set "JAVA_OPTS=%JAVA_OPTS% %LOGGING_CONFIG% -Xms1024m -Xmx10246m -XX:NewSize=256m -XX:MaxNewSize=356m -XX:PermSize=256m -XX:MaxPermSize=4096m -DSAP_CLOUD_AGENT_PROPERTIES_PATH=C:\<path to driver config file>\DriverConfig.properties"
- Run the startup.bat or startup.sh script.
- Environment variable running Tomcat via command line: Shut down the existing Tomcat process, add the environment variable, and restart Tomcat from a new command line window.
- Environment variable running Tomcat as a Windows service: Use the Tomcat configuration manager to restart the agent.
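As a concrete illustration, an edited driver properties file might contain entries like the following. The driver file names and folder paths shown here are hypothetical examples, not the actual files shipped by the database vendors:

```properties
# Entries left commented out with # are ignored by the agent.
#Sybase SQL Anywhere 17=

# One driver .jar for the database:
Sybase SQL Anywhere 16=C:\Program Files\SAP\SACAgentKit\drivers\sybase\jconn.jar

# More than one .jar file, separated by semicolons:
MS SQL Server 2016=C:\drivers\mssql\mssql-jdbc.jar;C:\drivers\mssql\extra.jar
```

After saving the file, restart the SAP Analytics Cloud agent as described above so that the changes take effect.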

The names of databases in your properties file must EXACTLY match the names shown in the list below. If you change the name, then the SQL connection will fail.
- #Amazon EMR 5.6 (Hive 2.1)=
- #Amazon EMR Hive 0.11=
- #Amazon EMR Hive 0.13=
- #Amazon Redshift=
- #Apache Hadoop HIVE=
- #Apache Hadoop Hive 0.10=
- #Apache Hadoop Hive 0.12=
- #Apache Hadoop Hive 0.13 HiveServer2=
- #Apache Hadoop Hive 0.14 HiveServer2=
- #Apache Hadoop Hive 0.7=
- #Apache Hadoop Hive 0.8=
- #Apache Hadoop Hive 0.9=
- #Apache Hadoop Hive 0.x HiveServer1=
- #Apache Hadoop Hive 0.x HiveServer2=
- #Apache Hadoop Hive 1.0 HiveServer2=
- #Apache Hadoop Hive 1.x HiveServer2=
- #Apache Hadoop Hive 2.x HiveServer2=
- #Apache Spark 1.0=
- #Apache Spark 2.0=
- #BusinessObjects Data Federator Server XI R3=
- #BusinessObjects Data Federator Server XI R4=
- #Cloudera Impala 1.0=
- #Cloudera Impala 2.0=
- #DB2 10 for LUW=
- #DB2 10 for z/OS=
- #DB2 10.5 for LUW=
- #DB2 11 for LUW=
- #DB2 UDB v5=
- #DB2 UDB v6=
- #DB2 UDB v7=
- #DB2 UDB v8=
- #DB2 for z/OS v11=
- #DB2 for z/OS v12=
- #DB2 v9=
- #Data Federator Server=
- #Data Federator Server XI R3=
- #Data Federator Server XI R4=
- #Generic JDBC datasource=
- #GreenPlum 3=
- #GreenPlum 4=
- #HP Vertica 6.1=
- #HP Vertica 7.1=
- #HP Vertica 8=
- #Hortonworks Data Platform 2.3=
- #IBM Puredata (Netezza)=
- #IBM Puredata (Netezza) Server 7=
- #Informix Dynamic Server 10=
- #Informix Dynamic Server 11=
- #Informix Dynamic Server 12=
- #Ingres Database 10=
- #Ingres Database 9=
- #MS Parallel Data Warehouse=
- #MS SQL Server=
- #MS SQL Server 2000=
- #MS SQL Server 2005=
- #MS SQL Server 2008=
- #MS SQL Server 2012=
- #MS SQL Server 2014=
- #MS SQL Server 2016=
- #MS SQL Server 6.5=
- #MS SQL Server 7.x=
- #MaxDB 7.7=
- #MaxDB 7.9=
- #MySQL=
- #MySQL 5=
- #Netezza Server=
- #Netezza Server 4=
- #Netezza Server 5=
- #Netezza Server 6=
- #Netezza Server 7=
- #Oracle 10=
- #Oracle 11=
- #Oracle 12=
- #Oracle 12c Release 2=
- #Oracle 7.3=
- #Oracle 8=
- #Oracle 8.0=
- #Oracle 8.1=
- #Oracle 9=
- #Oracle Exadata=
- #Oracle Exadata 11=
- #Oracle Exadata 12=
- #PostgreSQL 8=
- #PostgreSQL 9=
- #Progress OpenEdge 10=
- #Progress OpenEdge 11=
- #Sybase ASIQ 12=
- #Sybase Adaptive Server 11=
- #Sybase Adaptive Server 12=
- #Sybase Adaptive Server 15=
- #Sybase Adaptive Server Enterprise 15=
- #Sybase Adaptive Server Enterprise 15.5=
- #Sybase Adaptive Server Enterprise 15.7=
- #Sybase Adaptive Server Enterprise 16.0=
- #Sybase IQ 15=
- #Sybase IQ 16=
- #Sybase SQL Anywhere 10=
- #Sybase SQL Anywhere 11=
- #Sybase SQL Anywhere 12=
- #Sybase SQL Anywhere 16=
- #Sybase SQL Anywhere 17=
- #Sybase SQLServer 11=
- #Teradata 12=
- #Teradata 13=
- #Teradata 14=
- #Teradata 15=
- #Teradata 16=
- #Teradata V2 R=
- #Teradata V2 R6=
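Because the entry names must match exactly and every referenced .jar file must exist on disk, a small script can catch typos before you restart the agent. The following is a minimal sketch, not part of the product: the `SUPPORTED_NAMES` set holds only a few names from the list above for illustration, and `check_driver_properties` is a hypothetical helper name.

```python
from pathlib import Path

# Small subset of the supported database names from the list above;
# extend this set with the full list for real use.
SUPPORTED_NAMES = {
    "MS SQL Server 2016",
    "MySQL 5",
    "Sybase SQL Anywhere 16",
    "Oracle 12c Release 2",
}

def check_driver_properties(path):
    """Return a list of problems found in an agent driver properties file."""
    problems = []
    for raw in Path(path).read_text().splitlines():
        line = raw.strip()
        # Lines commented out with # (and blank lines) are ignored by the agent.
        if not line or line.startswith("#") or "=" not in line:
            continue
        name, value = line.split("=", 1)
        if name not in SUPPORTED_NAMES:
            problems.append(f"unknown database name: {name!r}")
        # Multiple driver .jar files are separated by semicolons.
        for jar in filter(None, value.split(";")):
            if not Path(jar).is_file():
                problems.append(f"{name}: driver not found at {jar!r}")
    return problems
```

Running the check against your properties file before restarting the agent avoids a failed connection caused by a misspelled database name or a wrong driver path.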
Once the JDBC drivers for the SQL database are installed and configured, the SQL database administrator provides you with the Location, Server (host:port), Database name, User Name, and Password, as you will need them when creating the connection in SAP Analytics Cloud.

Connection Creation
This step is completed by you, the SAP Analytics Cloud administrator. A new connection is created in SAP Analytics Cloud. Only the SQL databases that are configured in the properties file will be visible to you in the Connection Type list.
Data modelers can then use this connection to import data from the on-premise data source by creating new import models in SAP Analytics Cloud.
We will cover the steps in detail in the practice exercise for this lesson; for now, using a SQL database as an on-premise data source example, let's take a look at a summary of the process flow.
- In the side navigation menu, go to Connections.
- Select Add Connection and select SQL Databases from the Acquire Data options.
- In the Create New Connection dialog, select the Location of your SAP BTP cloud connector and Connection Type as provided by the SQL database administrator.
- In Connection Information, add a Name and Description for your connection.
- In Connection Details and Credentials, add the following information, as provided by the SQL database administrator:
- Enter the host name in the Server (host:port) field.
- Enter the Database.
- Enter the User Name used to connect to the SQL database.
- Enter the Password for the user.
If you want to share the credential details, select the option Share these credentials when sharing this connection. Otherwise, users will need to enter their own credentials in order to use the connection. If you don't share your credentials, users will be able to edit their credentials at any time without having to start a data acquisition process.
- Select Create. The new connection is added to the list of connections in the Connections area in SAP Analytics Cloud.
