StreamAnalytix Settings
1. Go to Setup from the left navigation pane and click the StreamAnalytix tab.
Provide values for the following properties:
StreamAnalytix Web URL
Configure the StreamAnalytix Web URL as shown below.

Zookeeper StreamAnalytix Node
The Zookeeper StreamAnalytix node is where Webstudio-specific properties are managed.

Zookeeper Configuration Node
The Zookeeper configuration node is where all the YAML properties are managed. A sketch of how these two Zookeeper nodes can be inspected follows this table.

Password Encryption Required
Enable Password Encryption Required to encrypt all password fields.

Spark Home
Spark Home is the path to the Spark installation on the machine where StreamAnalytix Studio is installed.

Spark Job Submit Mode
Spark Job Submit Mode is the mode in which Spark pipeline jobs are submitted. See Appendix-1 on deploying Livy and setting up the Spark 2 client. A sketch of a Livy batch submission follows this table.
The options are:
• spark-submit
• livy
• job-server

Hadoop User
Hadoop User is the StreamAnalytix user through which pipelines are uploaded to HDFS.
Note: In a Kerberos environment, make sure the Hadoop user specified here has a valid principal and keytab; a quick verification sketch follows this table.
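The two Zookeeper nodes above are paths in the Zookeeper tree under which StreamAnalytix keeps its Webstudio and YAML properties. If you want to confirm what is stored under them, the following is a minimal sketch using the kazoo Python client; the ensemble address and node paths (/streamanalytix/studio, /streamanalytix/config) are placeholders for illustration, not product defaults, so use the values you entered on the Setup page.

    # Minimal sketch: listing the properties stored under the configured
    # Zookeeper nodes. Ensemble address and node paths are placeholders.
    from kazoo.client import KazooClient

    zk = KazooClient(hosts="zk-host:2181")            # hypothetical ensemble
    zk.start()
    for node in ("/streamanalytix/studio", "/streamanalytix/config"):   # hypothetical paths
        if zk.exists(node):
            for child in zk.get_children(node):
                data, _stat = zk.get(node + "/" + child)
                print(node, child, data.decode("utf-8", "replace"))
    zk.stop()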
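When Spark Job Submit Mode is set to livy, pipeline jobs are handed to a Livy server over its REST API rather than launched locally through spark-submit from Spark Home. The sketch below shows a generic Livy batch submission in Python; the Livy endpoint, jar path, and class name are placeholders, and the exact request StreamAnalytix builds internally may differ.

    # Generic Livy batch submission over REST; endpoint, jar path, and
    # class name are placeholders, not StreamAnalytix defaults.
    import requests

    resp = requests.post(
        "http://livy-host:8998/batches",                        # hypothetical Livy endpoint
        json={
            "file": "hdfs:///user/sax/pipelines/pipeline.jar",  # hypothetical pipeline jar
            "className": "com.example.PipelineDriver",          # hypothetical main class
            "conf": {"spark.executor.memory": "2g"},
        },
    )
    resp.raise_for_status()
    print(resp.json())   # Livy responds with the batch id and its current state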
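On a Kerberized cluster, the Hadoop user configured above must be able to obtain a ticket from its keytab before pipelines can be uploaded to HDFS. A quick way to verify this is to run kinit against the keytab and then list the resulting ticket, as in the sketch below; the keytab path and principal shown are examples only.

    # Verify that the configured Hadoop user has a valid principal and keytab.
    # The keytab path and principal are examples only.
    import subprocess

    keytab = "/etc/security/keytabs/sax.headless.keytab"   # hypothetical keytab path
    principal = "sax@EXAMPLE.COM"                           # hypothetical principal

    subprocess.run(["kinit", "-kt", keytab, principal], check=True)  # obtain a ticket
    subprocess.run(["klist"], check=True)                            # confirm the ticket cache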