22/12/13 08:53:12 INFO DriverDaemon$: Started Log4j2
22/12/13 08:53:15 INFO DriverDaemon$: Current JVM Version 1.8.0_345
22/12/13 08:53:15 INFO DriverDaemon$: ========== driver starting up ==========
22/12/13 08:53:15 INFO DriverDaemon$: Java: Azul Systems, Inc. 1.8.0_345
22/12/13 08:53:15 INFO DriverDaemon$: OS: Linux/amd64 5.4.0-1090-azure
22/12/13 08:53:15 INFO DriverDaemon$: CWD: /databricks/driver
22/12/13 08:53:15 INFO DriverDaemon$: Mem: Max: 2.1G loaded GCs: PS Scavenge, PS MarkSweep
22/12/13 08:53:15 INFO DriverDaemon$: Logging multibyte characters: ✓
22/12/13 08:53:15 INFO DriverDaemon$: 'publicFile.rolling.rewrite' appender in root logger: class org.apache.logging.log4j.core.appender.rewrite.RewriteAppender
22/12/13 08:53:15 INFO DriverDaemon$: == Modules:
22/12/13 08:53:16 INFO DriverDaemon$: Starting prometheus metrics log export timer
22/12/13 08:53:16 WARN DriverConf: REGION environment variable is not defined. getConfForCurrentRegion will always return default value
22/12/13 08:53:16 INFO DriverDaemon$: Loaded JDBC drivers in 113 ms
22/12/13 08:53:16 INFO DriverDaemon$: Universe Git Hash: e549631c9fb0772d727af4ef0e40aacb100ac85c
22/12/13 08:53:16 INFO DriverDaemon$: Spark Git Hash: c5b6c26d8ff325aee8cf855c41cd47427d4659b9
22/12/13 08:53:16 INFO SparkConfUtils$: Customize spark config according to file /tmp/custom-spark.conf
22/12/13 08:53:17 WARN RunHelpers$: Missing tag isolation client: java.util.NoSuchElementException: key not found: TagDefinition(clientType,The client type for a request, used for isolating resources for the request.,false,false,List(),DATA_LABEL_UNSPECIFIED)
22/12/13 08:53:17 INFO DatabricksILoop$: Creating throwaway interpreter
22/12/13 08:53:17 INFO MetastoreMonitor$: Internal internal metastore configured (config=DbMetastoreConfig{host=consolidated-westeuropec2-prod-metastore-3.mysql.database.azure.com, port=3306, dbName=organization5367007285973203, user=DPyPuOLqCQ1h3t1I@consolidated-westeuropec2-prod-metastore-3})
22/12/13 08:53:17 INFO DataSourceFactory$: DataSource Jdbc URL: jdbc:mariadb://consolidated-westeuropec2-prod-metastore-3.mysql.database.azure.com:3306/organization5367007285973203?useSSL=true&sslMode=VERIFY_CA&disableSslHostnameVerification=true&trustServerCertificate=false&serverSslCert=/databricks/common/mysql-ssl-ca-cert.crt
22/12/13 08:53:17 INFO HikariDataSource: metastore-monitor - Starting...
22/12/13 08:53:17 WARN DynamicRpcConf: REGION environment variable is not defined. getConfForCurrentRegion will always return default value
22/12/13 08:53:17 INFO ConcurrentRateLimiterHook: No additional configuration supplied to the concurrent rate-limiter. Defaults would be used.
22/12/13 08:53:17 INFO ConcurrentRateLimiterHook: Concurrent rate-limiter limit - Dry-Run: false | Dimension: WORKSPACE | API: DEFAULT | High: 100 | Low: 50
22/12/13 08:53:17 INFO HikariDataSource: metastore-monitor - Start completed.
22/12/13 08:53:17 INFO ConcurrentRateLimiterHook: Concurrent rate-limiter limit - Dry-Run: false | Dimension: ACCOUNT_ID | API: DEFAULT | High: 100 | Low: 50
22/12/13 08:53:17 INFO DriverCorral: Creating the driver context
22/12/13 08:53:17 INFO DatabricksILoop$: Class Server Dir: /local_disk0/tmp/repl/spark-8689489123715673276-c1e5535d-46b0-43ad-ab4d-b013d962bd84
22/12/13 08:53:17 INFO SparkConfUtils$: Customize spark config according to file /tmp/custom-spark.conf
22/12/13 08:53:17 WARN SparkConf: The configuration key 'spark.akka.frameSize' has been deprecated as of Spark 1.6 and may be removed in the future.
Please use the new key 'spark.rpc.message.maxSize' instead. 22/12/13 08:53:17 INFO SparkContext: Running Spark version 3.3.0 22/12/13 08:53:18 INFO ResourceUtils: ============================================================== 22/12/13 08:53:18 INFO ResourceUtils: No custom resources configured for spark.driver. 22/12/13 08:53:18 INFO ResourceUtils: ============================================================== 22/12/13 08:53:18 INFO SparkContext: Submitted application: Databricks Shell 22/12/13 08:53:18 INFO ResourceProfile: Default ResourceProfile created, executor resources: Map(cores -> name: cores, amount: 1, script: , vendor: , memory -> name: memory, amount: 3157, script: , vendor: , offHeap -> name: offHeap, amount: 0, script: , vendor: ), task resources: Map(cpus -> name: cpus, amount: 1.0) 22/12/13 08:53:18 INFO ResourceProfile: Limiting resource is cpu 22/12/13 08:53:18 INFO ResourceProfileManager: Added ResourceProfile id: 0 22/12/13 08:53:18 INFO SecurityManager: Changing view acls to: root 22/12/13 08:53:18 INFO SecurityManager: Changing modify acls to: root 22/12/13 08:53:18 INFO SecurityManager: Changing view acls groups to: 22/12/13 08:53:18 INFO SecurityManager: Changing modify acls groups to: 22/12/13 08:53:18 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(root); groups with view permissions: Set(); users with modify permissions: Set(root); groups with modify permissions: Set() 22/12/13 08:53:19 INFO Utils: Successfully started service 'sparkDriver' on port 46551. 22/12/13 08:53:19 INFO SparkEnv: Registering MapOutputTracker 22/12/13 08:53:19 INFO SparkEnv: Registering BlockManagerMaster 22/12/13 08:53:19 INFO BlockManagerMasterEndpoint: Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information 22/12/13 08:53:19 INFO BlockManagerMasterEndpoint: BlockManagerMasterEndpoint up 22/12/13 08:53:19 INFO SparkEnv: Registering BlockManagerMasterHeartbeat 22/12/13 08:53:19 INFO DiskBlockManager: Created local directory at /local_disk0/blockmgr-872237fc-b343-4b09-9e58-43680242d699 22/12/13 08:53:19 INFO MemoryStore: MemoryStore started with capacity 1047.3 MiB 22/12/13 08:53:19 INFO SparkEnv: Registering OutputCommitCoordinator 22/12/13 08:53:19 INFO SparkContext: Spark configuration: eventLog.rolloverIntervalSeconds=900 libraryDownload.sleepIntervalSeconds=5 libraryDownload.timeoutSeconds=180 spark.akka.frameSize=256 spark.app.name=Databricks Shell spark.app.startTime=1670921597903 spark.cleaner.referenceTracking.blocking=false spark.databricks.acl.client=com.databricks.spark.sql.acl.client.SparkSqlAclClient spark.databricks.acl.provider=com.databricks.sql.acl.ReflectionBackedAclProvider spark.databricks.acl.scim.client=com.databricks.spark.sql.acl.client.DriverToWebappScimClient spark.databricks.automl.serviceEnabled=true spark.databricks.cloudProvider=Azure spark.databricks.cloudfetch.hasRegionSupport=true spark.databricks.cloudfetch.requesterClassName=*********(redacted) spark.databricks.clusterSource=JOB spark.databricks.clusterUsageTags.attribute_tag_budget= spark.databricks.clusterUsageTags.attribute_tag_dust_execution_env= spark.databricks.clusterUsageTags.attribute_tag_dust_maintainer= spark.databricks.clusterUsageTags.attribute_tag_dust_suite= spark.databricks.clusterUsageTags.attribute_tag_service= spark.databricks.clusterUsageTags.autoTerminationMinutes=0 spark.databricks.clusterUsageTags.azureSubscriptionId=25f80d07-1543-4ea2-9023-4fdfe28ccf52 
spark.databricks.clusterUsageTags.cloudProvider=Azure spark.databricks.clusterUsageTags.clusterAllTags=[{"key":"Vendor","value":"Databricks"},{"key":"Creator","value":"enrico.mosca@atos.net"},{"key":"ClusterName","value":"job-673644476546964-run-7315"},{"key":"ClusterId","value":"1213-084859-qjg9a4ar"},{"key":"JobId","value":"673644476546964"},{"key":"RunName","value":"main"},{"key":"DatabricksEnvironment","value":"workerenv-5367007285973203"}] spark.databricks.clusterUsageTags.clusterAvailability=ON_DEMAND_AZURE spark.databricks.clusterUsageTags.clusterCreator=JobLauncher spark.databricks.clusterUsageTags.clusterFirstOnDemand=1 spark.databricks.clusterUsageTags.clusterGeneration=0 spark.databricks.clusterUsageTags.clusterId=1213-084859-qjg9a4ar spark.databricks.clusterUsageTags.clusterLogDeliveryEnabled=false spark.databricks.clusterUsageTags.clusterLogDestination= spark.databricks.clusterUsageTags.clusterMetastoreAccessType=RDS_DIRECT spark.databricks.clusterUsageTags.clusterName=job-673644476546964-run-7315 spark.databricks.clusterUsageTags.clusterNoDriverDaemon=false spark.databricks.clusterUsageTags.clusterNodeType=Standard_F4 spark.databricks.clusterUsageTags.clusterNumCustomTags=0 spark.databricks.clusterUsageTags.clusterNumSshKeys=0 spark.databricks.clusterUsageTags.clusterOwnerOrgId=5367007285973203 spark.databricks.clusterUsageTags.clusterOwnerUserId=*********(redacted) spark.databricks.clusterUsageTags.clusterPinned=false spark.databricks.clusterUsageTags.clusterPythonVersion=3 spark.databricks.clusterUsageTags.clusterResourceClass=default spark.databricks.clusterUsageTags.clusterScalingType=fixed_size spark.databricks.clusterUsageTags.clusterSizeType=VM_CONTAINER spark.databricks.clusterUsageTags.clusterSku=STANDARD_SKU spark.databricks.clusterUsageTags.clusterSpotBidMaxPrice=-1.0 spark.databricks.clusterUsageTags.clusterState=Pending spark.databricks.clusterUsageTags.clusterStateMessage=Starting Spark spark.databricks.clusterUsageTags.clusterTargetWorkers=1 spark.databricks.clusterUsageTags.clusterUnityCatalogMode=CUSTOM spark.databricks.clusterUsageTags.clusterWorkers=1 spark.databricks.clusterUsageTags.containerType=LXC spark.databricks.clusterUsageTags.dataPlaneRegion=westeurope spark.databricks.clusterUsageTags.driverContainerId=ca234d8d2f69436ea366307001e58826 spark.databricks.clusterUsageTags.driverContainerPrivateIp=10.250.16.72 spark.databricks.clusterUsageTags.driverInstanceId=11e101ae5c204f56a5a7bd7969b18ad3 spark.databricks.clusterUsageTags.driverInstancePrivateIp=10.250.16.136 spark.databricks.clusterUsageTags.driverNodeType=Standard_F4 spark.databricks.clusterUsageTags.effectiveSparkVersion=11.2.x-scala2.12 spark.databricks.clusterUsageTags.enableCredentialPassthrough=*********(redacted) spark.databricks.clusterUsageTags.enableDfAcls=false spark.databricks.clusterUsageTags.enableElasticDisk=true spark.databricks.clusterUsageTags.enableGlueCatalogCredentialPassthrough=*********(redacted) spark.databricks.clusterUsageTags.enableJdbcAutoStart=true spark.databricks.clusterUsageTags.enableJobsAutostart=true spark.databricks.clusterUsageTags.enableLocalDiskEncryption=false spark.databricks.clusterUsageTags.enableSqlAclsOnly=false spark.databricks.clusterUsageTags.hailEnabled=false spark.databricks.clusterUsageTags.ignoreTerminationEventInAlerting=false spark.databricks.clusterUsageTags.instanceWorkerEnvId=workerenv-5367007285973203 spark.databricks.clusterUsageTags.instanceWorkerEnvNetworkType=vnet-injection 
spark.databricks.clusterUsageTags.isDpCpPrivateLinkEnabled=false spark.databricks.clusterUsageTags.isIMv2Enabled=false spark.databricks.clusterUsageTags.isServicePrincipalCluster=false spark.databricks.clusterUsageTags.isSingleUserCluster=*********(redacted) spark.databricks.clusterUsageTags.managedResourceGroup=RG-managed-DBW-acc-we-001-akfykxrrs7cpk spark.databricks.clusterUsageTags.ngrokNpipEnabled=true spark.databricks.clusterUsageTags.numPerClusterInitScriptsV2=0 spark.databricks.clusterUsageTags.numPerGlobalInitScriptsV2=0 spark.databricks.clusterUsageTags.orgId=5367007285973203 spark.databricks.clusterUsageTags.privateLinkEnabled=false spark.databricks.clusterUsageTags.region=westeurope spark.databricks.clusterUsageTags.sparkEnvVarContainsBacktick=false spark.databricks.clusterUsageTags.sparkEnvVarContainsDollarSign=false spark.databricks.clusterUsageTags.sparkEnvVarContainsDoubleQuotes=false spark.databricks.clusterUsageTags.sparkEnvVarContainsEscape=false spark.databricks.clusterUsageTags.sparkEnvVarContainsNewline=false spark.databricks.clusterUsageTags.sparkEnvVarContainsSingleQuotes=false spark.databricks.clusterUsageTags.sparkMasterUrlType=*********(redacted) spark.databricks.clusterUsageTags.sparkVersion=11.2.x-scala2.12 spark.databricks.clusterUsageTags.userId=*********(redacted) spark.databricks.clusterUsageTags.userProvidedRemoteVolumeCount=*********(redacted) spark.databricks.clusterUsageTags.userProvidedRemoteVolumeSizeGb=*********(redacted) spark.databricks.clusterUsageTags.userProvidedRemoteVolumeType=*********(redacted) spark.databricks.clusterUsageTags.userProvidedSparkVersion=*********(redacted) spark.databricks.clusterUsageTags.workerEnvironmentId=workerenv-5367007285973203 spark.databricks.credential.aws.secretKey.redactor=*********(redacted) spark.databricks.credential.redactor=*********(redacted) spark.databricks.credential.scope.fs.adls.gen2.tokenProviderClassName=*********(redacted) spark.databricks.credential.scope.fs.gs.auth.access.tokenProviderClassName=*********(redacted) spark.databricks.credential.scope.fs.impl=*********(redacted) spark.databricks.credential.scope.fs.s3a.tokenProviderClassName=*********(redacted) spark.databricks.delta.logStore.crossCloud.fatal=true spark.databricks.delta.multiClusterWrites.enabled=true spark.databricks.delta.preview.enabled=true spark.databricks.driverNfs.clusterWidePythonLibsEnabled=true spark.databricks.driverNfs.enabled=true spark.databricks.driverNfs.pathSuffix=.ephemeral_nfs spark.databricks.driverNodeTypeId=Standard_F4 spark.databricks.enablePublicDbfsFuse=false spark.databricks.eventLog.dir=eventlogs spark.databricks.eventLog.enabled=true spark.databricks.eventLog.listenerClassName=com.databricks.backend.daemon.driver.DBCEventLoggingListener spark.databricks.io.directoryCommit.enableLogicalDelete=false spark.databricks.managedCatalog.clientClassName=com.databricks.managedcatalog.ManagedCatalogClientImpl spark.databricks.metrics.filesystem_io_metrics=true spark.databricks.overrideDefaultCommitProtocol=org.apache.spark.sql.execution.datasources.SQLHadoopMapReduceCommitProtocol spark.databricks.passthrough.adls.gen2.tokenProviderClassName=*********(redacted) spark.databricks.passthrough.adls.tokenProviderClassName=*********(redacted) spark.databricks.passthrough.glue.credentialsProviderFactoryClassName=*********(redacted) spark.databricks.passthrough.glue.executorServiceFactoryClassName=*********(redacted) spark.databricks.passthrough.oauth.refresher.impl=*********(redacted) 
spark.databricks.passthrough.s3a.threadPoolExecutor.factory.class=com.databricks.backend.daemon.driver.aws.S3APassthroughThreadPoolExecutorFactory spark.databricks.passthrough.s3a.tokenProviderClassName=*********(redacted) spark.databricks.preemption.enabled=true spark.databricks.python.defaultPythonRepl=ipykernel spark.databricks.redactor=com.databricks.spark.util.DatabricksSparkLogRedactorProxy spark.databricks.repl.enableClassFileCleanup=true spark.databricks.secret.envVar.keys.toRedact=*********(redacted) spark.databricks.secret.sparkConf.keys.toRedact=*********(redacted) spark.databricks.service.dbutils.repl.backend=com.databricks.dbconnect.ReplDBUtils spark.databricks.service.dbutils.server.backend=com.databricks.dbconnect.SparkServerDBUtils spark.databricks.session.share=false spark.databricks.sparkContextId=8689489123715673276 spark.databricks.sql.configMapperClass=com.databricks.dbsql.config.SqlConfigMapperBridge spark.databricks.tahoe.logStore.aws.class=com.databricks.tahoe.store.MultiClusterLogStore spark.databricks.tahoe.logStore.azure.class=com.databricks.tahoe.store.AzureLogStore spark.databricks.tahoe.logStore.class=com.databricks.tahoe.store.DelegatingLogStore spark.databricks.tahoe.logStore.gcp.class=com.databricks.tahoe.store.GCPLogStore spark.databricks.workerNodeTypeId=Standard_F4 spark.databricks.workspaceUrl=*********(redacted) spark.databricks.wsfsPublicPreview=true spark.delta.sharing.profile.provider.class=*********(redacted) spark.driver.allowMultipleContexts=false spark.driver.extraJavaOptions=-XX:+IgnoreUnrecognizedVMOptions --add-opens=java.base/java.lang=ALL-UNNAMED --add-opens=java.base/java.lang.invoke=ALL-UNNAMED --add-opens=java.base/java.lang.reflect=ALL-UNNAMED --add-opens=java.base/java.io=ALL-UNNAMED --add-opens=java.base/java.net=ALL-UNNAMED --add-opens=java.base/java.nio=ALL-UNNAMED --add-opens=java.base/java.util=ALL-UNNAMED --add-opens=java.base/java.util.concurrent=ALL-UNNAMED --add-opens=java.base/java.util.concurrent.atomic=ALL-UNNAMED --add-opens=java.base/sun.nio.ch=ALL-UNNAMED --add-opens=java.base/sun.nio.cs=ALL-UNNAMED --add-opens=java.base/sun.security.action=ALL-UNNAMED --add-opens=java.base/sun.util.calendar=ALL-UNNAMED --add-opens=java.security.jgss/sun.security.krb5=ALL-UNNAMED spark.driver.host=10.250.16.72 spark.driver.maxResultSize=4g spark.driver.port=46551 spark.driver.tempDirectory=/local_disk0/tmp spark.eventLog.enabled=false spark.executor.extraClassPath=/databricks/spark/dbconf/log4j/executor:/databricks/spark/dbconf/jets3t/:/databricks/spark/dbconf/hadoop:/databricks/hive/conf:/databricks/jars/* spark.executor.extraJavaOptions=-XX:+IgnoreUnrecognizedVMOptions --add-opens=java.base/java.lang=ALL-UNNAMED --add-opens=java.base/java.lang.invoke=ALL-UNNAMED --add-opens=java.base/java.lang.reflect=ALL-UNNAMED --add-opens=java.base/java.io=ALL-UNNAMED --add-opens=java.base/java.net=ALL-UNNAMED --add-opens=java.base/java.nio=ALL-UNNAMED --add-opens=java.base/java.util=ALL-UNNAMED --add-opens=java.base/java.util.concurrent=ALL-UNNAMED --add-opens=java.base/java.util.concurrent.atomic=ALL-UNNAMED --add-opens=java.base/sun.nio.ch=ALL-UNNAMED --add-opens=java.base/sun.nio.cs=ALL-UNNAMED --add-opens=java.base/sun.security.action=ALL-UNNAMED --add-opens=java.base/sun.util.calendar=ALL-UNNAMED --add-opens=java.security.jgss/sun.security.krb5=ALL-UNNAMED -Djava.io.tmpdir=/local_disk0/tmp -XX:ReservedCodeCacheSize=512m -XX:+UseCodeCacheFlushing -XX:PerMethodRecompilationCutoff=-1 -XX:PerBytecodeRecompilationCutoff=-1 
-Djava.security.properties=/databricks/spark/dbconf/java/extra.security -XX:-UseContainerSupport -XX:+PrintFlagsFinal -XX:+PrintGCDateStamps -XX:+PrintGCDetails -verbose:gc -Xss4m -Djava.library.path=/usr/java/packages/lib/amd64:/usr/lib64:/lib64:/lib:/usr/lib:/usr/lib/x86_64-linux-gnu/jni:/lib/x86_64-linux-gnu:/usr/lib/x86_64-linux-gnu:/usr/lib/jni -Djavax.xml.datatype.DatatypeFactory=com.sun.org.apache.xerces.internal.jaxp.datatype.DatatypeFactoryImpl -Djavax.xml.parsers.DocumentBuilderFactory=com.sun.org.apache.xerces.internal.jaxp.DocumentBuilderFactoryImpl -Djavax.xml.parsers.SAXParserFactory=com.sun.org.apache.xerces.internal.jaxp.SAXParserFactoryImpl -Djavax.xml.validation.SchemaFactory:http://www.w3.org/2001/XMLSchema=com.sun.org.apache.xerces.internal.jaxp.validation.XMLSchemaFactory -Dorg.xml.sax.driver=com.sun.org.apache.xerces.internal.parsers.SAXParser -Dorg.w3c.dom.DOMImplementationSourceList=com.sun.org.apache.xerces.internal.dom.DOMXSImplementationSourceImpl -Djavax.net.ssl.sessionCacheSize=10000 -Dscala.reflect.runtime.disable.typetag.cache=true -Dcom.google.cloud.spark.bigquery.repackaged.io.netty.tryReflectionSetAccessible=true -Dlog4j2.formatMsgNoLookups=true -Ddatabricks.serviceName=spark-executor-1 spark.executor.id=driver spark.executor.memory=3157m spark.executor.tempDirectory=/local_disk0/tmp spark.extraListeners=com.databricks.backend.daemon.driver.DBCEventLoggingListener spark.files.fetchFailure.unRegisterOutputOnHost=true spark.files.overwrite=true spark.files.useFetchCache=false spark.hadoop.databricks.dbfs.client.version=v2 spark.hadoop.databricks.fs.perfMetrics.enable=true spark.hadoop.databricks.s3.amazonS3Client.cache.enabled=false spark.hadoop.databricks.s3.create.deleteUnnecessaryFakeDirectories=false spark.hadoop.databricks.s3.verifyBucketExists.enabled=false spark.hadoop.databricks.s3commit.client.sslTrustAll=false spark.hadoop.fs.AbstractFileSystem.gs.impl=shaded.databricks.com.google.cloud.hadoop.fs.gcs.GoogleHadoopFS spark.hadoop.fs.abfs.impl=shaded.databricks.azurebfs.org.apache.hadoop.fs.azurebfs.AzureBlobFileSystemHadoop3 spark.hadoop.fs.abfs.impl.disable.cache=true spark.hadoop.fs.abfss.impl=shaded.databricks.azurebfs.org.apache.hadoop.fs.azurebfs.SecureAzureBlobFileSystemHadoop3 spark.hadoop.fs.abfss.impl.disable.cache=true spark.hadoop.fs.adl.impl=com.databricks.adl.AdlFileSystem spark.hadoop.fs.adl.impl.disable.cache=true spark.hadoop.fs.azure.authorization.caching.enable=false spark.hadoop.fs.azure.cache.invalidator.type=com.databricks.encryption.utils.CacheInvalidatorImpl spark.hadoop.fs.azure.skip.metrics=true spark.hadoop.fs.azure.user.agent.prefix=*********(redacted) spark.hadoop.fs.cpfs-abfss.impl=*********(redacted) spark.hadoop.fs.cpfs-abfss.impl.disable.cache=true spark.hadoop.fs.cpfs-adl.impl=*********(redacted) spark.hadoop.fs.cpfs-adl.impl.disable.cache=true spark.hadoop.fs.cpfs-s3.impl=*********(redacted) spark.hadoop.fs.cpfs-s3a.impl=*********(redacted) spark.hadoop.fs.cpfs-s3n.impl=*********(redacted) spark.hadoop.fs.dbfs.impl=com.databricks.backend.daemon.data.client.DbfsHadoop3 spark.hadoop.fs.dbfsartifacts.impl=com.databricks.backend.daemon.data.client.DBFSV1 spark.hadoop.fs.fcfs-abfs.impl=*********(redacted) spark.hadoop.fs.fcfs-abfs.impl.disable.cache=true spark.hadoop.fs.fcfs-abfss.impl=*********(redacted) spark.hadoop.fs.fcfs-abfss.impl.disable.cache=true spark.hadoop.fs.fcfs-s3.impl=*********(redacted) spark.hadoop.fs.fcfs-s3.impl.disable.cache=true spark.hadoop.fs.fcfs-s3a.impl=*********(redacted) 
spark.hadoop.fs.fcfs-s3a.impl.disable.cache=true spark.hadoop.fs.fcfs-s3n.impl=*********(redacted) spark.hadoop.fs.fcfs-s3n.impl.disable.cache=true spark.hadoop.fs.fcfs-wasb.impl=*********(redacted) spark.hadoop.fs.fcfs-wasb.impl.disable.cache=true spark.hadoop.fs.fcfs-wasbs.impl=*********(redacted) spark.hadoop.fs.fcfs-wasbs.impl.disable.cache=true spark.hadoop.fs.file.impl=com.databricks.backend.daemon.driver.WorkspaceLocalFileSystem spark.hadoop.fs.gs.impl=shaded.databricks.com.google.cloud.hadoop.fs.gcs.GoogleHadoopFileSystem spark.hadoop.fs.gs.impl.disable.cache=true spark.hadoop.fs.gs.outputstream.upload.chunk.size=16777216 spark.hadoop.fs.idbfs.impl=com.databricks.io.idbfs.IdbfsFileSystem spark.hadoop.fs.mcfs-s3a.impl=com.databricks.sql.acl.fs.ManagedCatalogFileSystem spark.hadoop.fs.mlflowdbfs.impl=com.databricks.mlflowdbfs.MlflowdbfsFileSystem spark.hadoop.fs.s3.impl=shaded.databricks.org.apache.hadoop.fs.s3a.S3AFileSystemHadoop3 spark.hadoop.fs.s3a.assumed.role.credentials.provider=*********(redacted) spark.hadoop.fs.s3a.attempts.maximum=10 spark.hadoop.fs.s3a.block.size=67108864 spark.hadoop.fs.s3a.connection.maximum=200 spark.hadoop.fs.s3a.connection.timeout=50000 spark.hadoop.fs.s3a.fast.upload=true spark.hadoop.fs.s3a.fast.upload.active.blocks=32 spark.hadoop.fs.s3a.fast.upload.default=true spark.hadoop.fs.s3a.impl=shaded.databricks.org.apache.hadoop.fs.s3a.S3AFileSystemHadoop3 spark.hadoop.fs.s3a.max.total.tasks=1000 spark.hadoop.fs.s3a.multipart.size=10485760 spark.hadoop.fs.s3a.multipart.threshold=104857600 spark.hadoop.fs.s3a.retry.limit=20 spark.hadoop.fs.s3a.retry.throttle.interval=500ms spark.hadoop.fs.s3a.threads.max=136 spark.hadoop.fs.s3n.impl=shaded.databricks.org.apache.hadoop.fs.s3a.S3AFileSystemHadoop3 spark.hadoop.fs.stage.impl=com.databricks.backend.daemon.driver.managedcatalog.PersonalStagingFileSystem spark.hadoop.fs.stage.impl.disable.cache=true spark.hadoop.fs.wasb.impl=shaded.databricks.org.apache.hadoop.fs.azure.NativeAzureFileSystem spark.hadoop.fs.wasb.impl.disable.cache=true spark.hadoop.fs.wasbs.impl=shaded.databricks.org.apache.hadoop.fs.azure.NativeAzureFileSystem spark.hadoop.fs.wasbs.impl.disable.cache=true spark.hadoop.hive.hmshandler.retry.attempts=10 spark.hadoop.hive.hmshandler.retry.interval=2000 spark.hadoop.hive.server2.enable.doAs=false spark.hadoop.hive.server2.idle.operation.timeout=7200000 spark.hadoop.hive.server2.idle.session.timeout=900000 spark.hadoop.hive.server2.keystore.password=*********(redacted) spark.hadoop.hive.server2.keystore.path=/databricks/keys/jetty-ssl-driver-keystore.jks spark.hadoop.hive.server2.session.check.interval=60000 spark.hadoop.hive.server2.thrift.http.cookie.auth.enabled=false spark.hadoop.hive.server2.thrift.http.port=10000 spark.hadoop.hive.server2.transport.mode=http spark.hadoop.hive.server2.use.SSL=true spark.hadoop.hive.warehouse.subdir.inherit.perms=false spark.hadoop.mapred.output.committer.class=com.databricks.backend.daemon.data.client.DirectOutputCommitter spark.hadoop.mapreduce.fileoutputcommitter.algorithm.version=2 spark.hadoop.parquet.abfs.readahead.optimization.enabled=true spark.hadoop.parquet.block.size.row.check.max=10 spark.hadoop.parquet.block.size.row.check.min=10 spark.hadoop.parquet.filter.columnindex.enabled=false spark.hadoop.parquet.memory.pool.ratio=0.5 spark.hadoop.parquet.page.metadata.validation.enabled=true spark.hadoop.parquet.page.size.check.estimate=false spark.hadoop.parquet.page.verify-checksum.enabled=true spark.hadoop.parquet.page.write-checksum.enabled=true 
spark.hadoop.spark.databricks.io.parquet.verifyChecksumOnWrite.enabled=false spark.hadoop.spark.databricks.io.parquet.verifyChecksumOnWrite.throwsException=false spark.hadoop.spark.driverproxy.customHeadersToProperties=*********(redacted) spark.hadoop.spark.hadoop.aws.glue.cache.db.size=1000 spark.hadoop.spark.hadoop.aws.glue.cache.db.ttl-mins=30 spark.hadoop.spark.hadoop.aws.glue.cache.table.size=1000 spark.hadoop.spark.hadoop.aws.glue.cache.table.ttl-mins=30 spark.hadoop.spark.sql.parquet.output.committer.class=org.apache.spark.sql.parquet.DirectParquetOutputCommitter spark.hadoop.spark.sql.sources.outputCommitterClass=com.databricks.backend.daemon.data.client.MapReduceDirectOutputCommitter spark.home=/databricks/spark spark.logConf=true spark.master=spark://10.250.16.72:7077 spark.metrics.conf=/databricks/spark/conf/metrics.properties spark.r.backendConnectionTimeout=604800 spark.r.numRBackendThreads=1 spark.rdd.compress=true spark.repl.class.outputDir=/local_disk0/tmp/repl/spark-8689489123715673276-c1e5535d-46b0-43ad-ab4d-b013d962bd84 spark.rpc.message.maxSize=256 spark.scheduler.listenerbus.eventqueue.capacity=20000 spark.scheduler.mode=FAIR spark.serializer.objectStreamReset=100 spark.shuffle.manager=SORT spark.shuffle.memoryFraction=0.2 spark.shuffle.reduceLocality.enabled=false spark.shuffle.service.enabled=true spark.shuffle.service.port=4048 spark.sparklyr-backend.threads=1 spark.sparkr.use.daemon=false spark.speculation=false spark.speculation.multiplier=3 spark.speculation.quantile=0.9 spark.sql.allowMultipleContexts=false spark.sql.hive.convertCTAS=true spark.sql.hive.convertMetastoreParquet=true spark.sql.hive.metastore.jars=/databricks/databricks-hive/* spark.sql.hive.metastore.sharedPrefixes=org.mariadb.jdbc,com.mysql.jdbc,org.postgresql,com.microsoft.sqlserver,microsoft.sql.DateTimeOffset,microsoft.sql.Types,com.databricks,com.codahale,com.fasterxml.jackson,shaded.databricks spark.sql.hive.metastore.version=0.13.0 spark.sql.legacy.createHiveTableByDefault=false spark.sql.parquet.cacheMetadata=true spark.sql.parquet.compression.codec=snappy spark.sql.sources.commitProtocolClass=com.databricks.sql.transaction.directory.DirectoryAtomicCommitProtocol spark.sql.sources.default=delta spark.sql.streaming.checkpointFileManagerClass=com.databricks.spark.sql.streaming.DatabricksCheckpointFileManager spark.sql.streaming.stopTimeout=15s spark.sql.warehouse.dir=*********(redacted) spark.storage.blockManagerTimeoutIntervalMs=300000 spark.storage.memoryFraction=0.5 spark.streaming.driver.writeAheadLog.allowBatching=true spark.streaming.driver.writeAheadLog.closeFileAfterWrite=true spark.task.reaper.enabled=true spark.task.reaper.killTimeout=60s spark.ui.port=40001 spark.worker.aioaLazyConfig.dbfsReadinessCheckClientClass=com.databricks.backend.daemon.driver.NephosDbfsReadinessCheckClient spark.worker.aioaLazyConfig.iamReadinessCheckClientClass=com.databricks.backend.daemon.driver.NephosIamRoleCheckClient spark.worker.cleanup.enabled=false 22/12/13 08:53:19 WARN MetricsSystem: Using default name SparkStatusTracker for source because neither spark.metrics.namespace nor spark.app.id is set. 
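The MetricsSystem warning just above fires because metric sources are registered before the application ID is assigned (the connection to the master only happens a few seconds later) and spark.metrics.namespace is not set. A minimal sketch of pinning the namespace, assuming it is supplied with the rest of the cluster's Spark config before the context starts; the name "databricks_shell" is only illustrative:

from pyspark.sql import SparkSession

# spark.metrics.namespace must be known before metric sources register,
# so it is supplied at session construction time (illustrative value).
spark = (
    SparkSession.builder
    .config("spark.metrics.namespace", "databricks_shell")
    .getOrCreate()
)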
22/12/13 08:53:19 INFO log: Logging initialized @11141ms to org.eclipse.jetty.util.log.Slf4jLog 22/12/13 08:53:20 INFO Server: jetty-9.4.46.v20220331; built: 2022-03-31T16:38:08.030Z; git: bc17a0369a11ecf40bb92c839b9ef0a8ac50ea18; jvm 1.8.0_345-b01 22/12/13 08:53:20 INFO Server: Started @11772ms 22/12/13 08:53:20 INFO AbstractConnector: Started ServerConnector@671f545b{HTTP/1.1, (http/1.1)}{10.250.16.72:40001} 22/12/13 08:53:20 INFO Utils: Successfully started service 'SparkUI' on port 40001. 22/12/13 08:53:20 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@18f13756{/,null,AVAILABLE,@Spark} 22/12/13 08:53:21 WARN FairSchedulableBuilder: Fair Scheduler configuration file not found so jobs will be scheduled in FIFO order. To use fair scheduling, configure pools in fairscheduler.xml or set spark.scheduler.allocation.file to a file that contains the configuration. 22/12/13 08:53:21 INFO FairSchedulableBuilder: Created default pool: default, schedulingMode: FIFO, minShare: 0, weight: 1 22/12/13 08:53:21 INFO DatabricksEdgeConfigs: serverlessEnabled : false 22/12/13 08:53:21 INFO DatabricksEdgeConfigs: perfPackEnabled : false 22/12/13 08:53:21 INFO DatabricksEdgeConfigs: classicSqlEnabled : false 22/12/13 08:53:22 INFO StandaloneAppClient$ClientEndpoint: Connecting to master spark://10.250.16.72:7077... 22/12/13 08:53:22 INFO TransportClientFactory: Successfully created connection to /10.250.16.72:7077 after 114 ms (0 ms spent in bootstraps) 22/12/13 08:53:22 INFO StandaloneSchedulerBackend: Connected to Spark cluster with app ID app-20221213085322-0000 22/12/13 08:53:22 INFO TaskSchedulerImpl: Task preemption enabled. 22/12/13 08:53:22 INFO Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 39233. 
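The FairSchedulableBuilder warning above means spark.scheduler.mode=FAIR is in effect but no pool definitions were found, so everything lands in the single default pool scheduled FIFO. A minimal sketch of wiring up a pools file, assuming one has been placed at the hypothetical path /dbfs/configs/fairscheduler.xml and that spark.scheduler.allocation.file is set in the cluster's Spark config before the context starts:

# spark.scheduler.allocation.file must point at the pools file before the
# SparkContext is created, e.g. in the same custom Spark conf used above:
#   spark.scheduler.allocation.file /dbfs/configs/fairscheduler.xml   (hypothetical path)
from pyspark import SparkContext

sc = SparkContext.getOrCreate()
# Jobs submitted from this thread then target a named pool defined in that file.
sc.setLocalProperty("spark.scheduler.pool", "default")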
22/12/13 08:53:22 INFO NettyBlockTransferService: Server created on 10.250.16.72:39233 22/12/13 08:53:22 INFO BlockManager: Using org.apache.spark.storage.RandomBlockReplicationPolicy for block replication policy 22/12/13 08:53:22 INFO BlockManager: external shuffle service port = 4048 22/12/13 08:53:22 INFO BlockManagerMaster: Registering BlockManager BlockManagerId(driver, 10.250.16.72, 39233, None) 22/12/13 08:53:22 INFO StandaloneAppClient$ClientEndpoint: Executor added: app-20221213085322-0000/0 on worker-20221213085318-10.250.16.73-34665 (10.250.16.73:34665) with 4 core(s) 22/12/13 08:53:22 INFO BlockManagerMasterEndpoint: Registering block manager 10.250.16.72:39233 with 1047.3 MiB RAM, BlockManagerId(driver, 10.250.16.72, 39233, None) 22/12/13 08:53:22 INFO StandaloneSchedulerBackend: Granted executor ID app-20221213085322-0000/0 on hostPort 10.250.16.73:34665 with 4 core(s), 3.1 GiB RAM 22/12/13 08:53:22 INFO BlockManagerMaster: Registered BlockManager BlockManagerId(driver, 10.250.16.72, 39233, None) 22/12/13 08:53:22 INFO BlockManager: Initialized BlockManager: BlockManagerId(driver, 10.250.16.72, 39233, None) 22/12/13 08:53:23 INFO StandaloneAppClient$ClientEndpoint: Executor updated: app-20221213085322-0000/0 is now RUNNING 22/12/13 08:53:23 INFO DBCEventLoggingListener: Initializing DBCEventLoggingListener 22/12/13 08:53:23 INFO DBCEventLoggingListener: Logging events to eventlogs/8689489123715673276/eventlog 22/12/13 08:53:23 INFO SparkContext: Registered listener com.databricks.backend.daemon.driver.DBCEventLoggingListener 22/12/13 08:53:24 INFO DatabricksILoop$: Finished creating throwaway interpreter 22/12/13 08:53:24 INFO ContextHandler: Stopped o.e.j.s.ServletContextHandler@18f13756{/,null,STOPPED,@Spark} 22/12/13 08:53:24 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@6fd2acf5{/jobs,null,AVAILABLE,@Spark} 22/12/13 08:53:24 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@368ff8be{/jobs/json,null,AVAILABLE,@Spark} 22/12/13 08:53:24 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@30adae45{/jobs/job,null,AVAILABLE,@Spark} 22/12/13 08:53:24 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@66e62fc4{/jobs/job/json,null,AVAILABLE,@Spark} 22/12/13 08:53:24 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@23aaa756{/stages,null,AVAILABLE,@Spark} 22/12/13 08:53:24 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@7a92827f{/stages/json,null,AVAILABLE,@Spark} 22/12/13 08:53:24 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@1e18876d{/stages/stage,null,AVAILABLE,@Spark} 22/12/13 08:53:24 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@5c4516ec{/stages/stage/json,null,AVAILABLE,@Spark} 22/12/13 08:53:24 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@4d93e5bf{/stages/pool,null,AVAILABLE,@Spark} 22/12/13 08:53:24 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@68303c3e{/stages/pool/json,null,AVAILABLE,@Spark} 22/12/13 08:53:24 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@7c2e88b9{/storage,null,AVAILABLE,@Spark} 22/12/13 08:53:24 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@3c5044fa{/storage/json,null,AVAILABLE,@Spark} 22/12/13 08:53:24 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@b386a17{/storage/rdd,null,AVAILABLE,@Spark} 22/12/13 08:53:24 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@2213854b{/storage/rdd/json,null,AVAILABLE,@Spark} 22/12/13 08:53:24 INFO ContextHandler: Started 
o.e.j.s.ServletContextHandler@33bd9ac3{/environment,null,AVAILABLE,@Spark} 22/12/13 08:53:24 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@42028589{/environment/json,null,AVAILABLE,@Spark} 22/12/13 08:53:24 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@fc21ff4{/executors,null,AVAILABLE,@Spark} 22/12/13 08:53:24 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@e1c91cd{/executors/json,null,AVAILABLE,@Spark} 22/12/13 08:53:24 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@764a3867{/executors/threadDump,null,AVAILABLE,@Spark} 22/12/13 08:53:24 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@67e1a5fd{/executors/threadDump/json,null,AVAILABLE,@Spark} 22/12/13 08:53:24 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@6cb3463b{/executors/heapHistogram,null,AVAILABLE,@Spark} 22/12/13 08:53:24 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@4536a09a{/executors/heapHistogram/json,null,AVAILABLE,@Spark} 22/12/13 08:53:24 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@16cbba0f{/static,null,AVAILABLE,@Spark} 22/12/13 08:53:24 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@742dbac8{/,null,AVAILABLE,@Spark} 22/12/13 08:53:24 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@3538a129{/api,null,AVAILABLE,@Spark} 22/12/13 08:53:24 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@19855799{/jobs/job/kill,null,AVAILABLE,@Spark} 22/12/13 08:53:24 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@5020b59f{/stages/stage/kill,null,AVAILABLE,@Spark} 22/12/13 08:53:24 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@4e3a6c83{/metrics/json,null,AVAILABLE,@Spark} 22/12/13 08:53:24 INFO StandaloneSchedulerBackend: SchedulerBackend is ready for scheduling beginning after reached minRegisteredResourcesRatio: 0.0 22/12/13 08:53:24 INFO SparkContext: Loading Spark Service RPC Server. Classloader stack:List(com.databricks.backend.daemon.driver.ClassLoaders$MultiReplClassLoader@77020328, com.databricks.backend.daemon.driver.ClassLoaders$LibraryClassLoader@22ee1ad7, sun.misc.Launcher$AppClassLoader@45afc369, sun.misc.Launcher$ExtClassLoader@cecf639) 22/12/13 08:53:25 INFO SparkServiceRPCServer: Initializing Spark Service RPC Server. Classloader stack: List(com.databricks.backend.daemon.driver.ClassLoaders$MultiReplClassLoader@77020328, com.databricks.backend.daemon.driver.ClassLoaders$LibraryClassLoader@22ee1ad7, sun.misc.Launcher$AppClassLoader@45afc369, sun.misc.Launcher$ExtClassLoader@cecf639) 22/12/13 08:53:25 INFO SparkServiceRPCServer: Starting Spark Service RPC Server 22/12/13 08:53:25 INFO SparkServiceRPCServer: Starting Spark Service RPC Server. Classloader stack: List(com.databricks.backend.daemon.driver.ClassLoaders$MultiReplClassLoader@77020328, com.databricks.backend.daemon.driver.ClassLoaders$LibraryClassLoader@22ee1ad7, sun.misc.Launcher$AppClassLoader@45afc369, sun.misc.Launcher$ExtClassLoader@cecf639) 22/12/13 08:53:25 INFO Server: jetty-9.4.46.v20220331; built: 2022-03-31T16:38:08.030Z; git: bc17a0369a11ecf40bb92c839b9ef0a8ac50ea18; jvm 1.8.0_345-b01 22/12/13 08:53:25 INFO AbstractConnector: Started ServerConnector@10e4cc6{HTTP/1.1, (http/1.1)}{0.0.0.0:15001} 22/12/13 08:53:25 INFO Server: Started @16486ms 22/12/13 08:53:25 INFO DatabricksILoop$: Successfully registered spark metrics in Prometheus registry 22/12/13 08:53:25 INFO DatabricksILoop$: Successfully initialized SparkContext 22/12/13 08:53:25 INFO SharedState: Scheduler stats enabled. 
22/12/13 08:53:25 INFO SharedState: Setting hive.metastore.warehouse.dir ('null') to the value of spark.sql.warehouse.dir. 22/12/13 08:53:25 INFO SharedState: Warehouse path is 'dbfs:/user/hive/warehouse'. 22/12/13 08:53:25 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@8ad6d29{/storage/iocache,null,AVAILABLE,@Spark} 22/12/13 08:53:25 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@4e35a219{/storage/iocache/json,null,AVAILABLE,@Spark} 22/12/13 08:53:25 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@2748bd6c{/SQL,null,AVAILABLE,@Spark} 22/12/13 08:53:25 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@7ab3f194{/SQL/json,null,AVAILABLE,@Spark} 22/12/13 08:53:25 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@8ab231{/SQL/execution,null,AVAILABLE,@Spark} 22/12/13 08:53:25 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@6ba0dcba{/SQL/execution/json,null,AVAILABLE,@Spark} 22/12/13 08:53:25 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@7e741d6b{/static/sql,null,AVAILABLE,@Spark} 22/12/13 08:53:26 WARN SQLConf: The SQL config 'spark.sql.hive.convertCTAS' has been deprecated in Spark v3.1 and may be removed in the future. Set 'spark.sql.legacy.createHiveTableByDefault' to false instead. 22/12/13 08:53:26 WARN SQLConf: The SQL config 'spark.sql.hive.convertCTAS' has been deprecated in Spark v3.1 and may be removed in the future. Set 'spark.sql.legacy.createHiveTableByDefault' to false instead. 22/12/13 08:53:29 WARN DriverConf: REGION environment variable is not defined. getConfForCurrentRegion will always return default value 22/12/13 08:53:30 INFO CoarseGrainedSchedulerBackend$DriverEndpoint: Registered executor NettyRpcEndpointRef(spark-client://Executor) (10.250.16.73:59624) with ID 0, ResourceProfileId 0 22/12/13 08:53:31 INFO BlockManagerMasterEndpoint: Registering block manager 10.250.16.73:33801 with 1504.5 MiB RAM, BlockManagerId(0, 10.250.16.73, 33801, None) 22/12/13 08:53:31 INFO DatabricksMountsStore: Mount store initialization: Attempting to get the list of mounts from metadata manager of DBFS 22/12/13 08:53:31 INFO log: Logging initialized @23185ms to shaded.v9_4.org.eclipse.jetty.util.log.Slf4jLog 22/12/13 08:53:32 INFO HikariDataSource: metastore-monitor - Shutdown initiated... 22/12/13 08:53:32 INFO TypeUtil: JVM Runtime does not support Modules 22/12/13 08:53:33 INFO DatabricksMountsStore: Mount store initialization: Received a list of 5 mounts accessible from metadata manager of DBFS 22/12/13 08:53:33 INFO DatabricksMountsStore: Updated mounts cache. 
Changes: List((+,DbfsMountPoint(s3a://databricks-datasets-oregon/, /databricks-datasets)), (+,DbfsMountPoint(unsupported-access-mechanism-for-path--use-mlflow-client:/, /databricks/mlflow-tracking)), (+,DbfsMountPoint(wasbs://dbstoragebygxlx6q7aamc.blob.core.windows.net/5367007285973203, /databricks-results)), (+,DbfsMountPoint(unsupported-access-mechanism-for-path--use-mlflow-client:/, /databricks/mlflow-registry)), (+,DbfsMountPoint(wasbs://dbstoragebygxlx6q7aamc.blob.core.windows.net/5367007285973203, /))) 22/12/13 08:53:33 INFO DatabricksFileSystemV2Factory: Creating wasbs file system for wasbs://root@dbstoragebygxlx6q7aamc.blob.core.windows.net 22/12/13 08:53:33 INFO AzureNativeFileSystemStore: URI scheme: wasbs, using https for connections 22/12/13 08:53:34 INFO NativeAzureFileSystem: Delete with limit configurations: deleteFileCountLimitEnabled=false, deleteFileCountLimit=-1 22/12/13 08:53:34 INFO DbfsHadoop3: Initialized DBFS with DBFSV2 as the delegate. 22/12/13 08:53:34 INFO DriverCorral: Creating the driver context 22/12/13 08:53:34 INFO DriverCorral: Create default DBUtils 22/12/13 08:53:34 INFO StateStoreCoordinatorRef: Registered StateStoreCoordinator endpoint 22/12/13 08:53:34 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@3d2bcf0a{/StreamingQuery,null,AVAILABLE,@Spark} 22/12/13 08:53:34 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@270a113f{/StreamingQuery/json,null,AVAILABLE,@Spark} 22/12/13 08:53:34 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@4821e1a3{/StreamingQuery/statistics,null,AVAILABLE,@Spark} 22/12/13 08:53:34 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@70b437bb{/StreamingQuery/statistics/json,null,AVAILABLE,@Spark} 22/12/13 08:53:34 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@78395e28{/static/sql,null,AVAILABLE,@Spark} 22/12/13 08:53:34 INFO JettyServer$: Creating thread pool with name ... 22/12/13 08:53:34 INFO JettyServer$: Thread pool created 22/12/13 08:53:34 INFO JettyServer$: Creating thread pool with name ... 22/12/13 08:53:34 INFO JettyServer$: Thread pool created 22/12/13 08:53:34 INFO DriverDaemon: Starting driver daemon... 22/12/13 08:53:34 INFO SparkConfUtils$: Customize spark config according to file /tmp/custom-spark.conf 22/12/13 08:53:34 WARN SparkConf: The configuration key 'spark.akka.frameSize' has been deprecated as of Spark 1.6 and may be removed in the future. Please use the new key 'spark.rpc.message.maxSize' instead. 
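The SparkConf warning just above is raised because the legacy key spark.akka.frameSize (deprecated since Spark 1.6) is still being applied, apparently via /tmp/custom-spark.conf, which is customized immediately before the warning; the configuration dump earlier already shows the supported key spark.rpc.message.maxSize=256 carrying the same limit. A minimal sketch of setting only the supported key, assuming the value is applied before the session is created:

from pyspark.sql import SparkSession

# Maximum RPC message size in MiB; replaces the deprecated spark.akka.frameSize.
spark = (
    SparkSession.builder
    .config("spark.rpc.message.maxSize", "256")
    .getOrCreate()
)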
22/12/13 08:53:34 INFO DriverDaemon$: Attempting to run: 'set up ttyd daemon' 22/12/13 08:53:34 INFO DriverDaemon$: Attempting to run: 'Configuring RStudio daemon' 22/12/13 08:53:34 INFO DriverDaemon$: Resetting the default python executable 22/12/13 08:53:34 INFO Utils: resolved command to be run: List(virtualenv, /local_disk0/.ephemeral_nfs/cluster_libraries/python, -p, /databricks/python/bin/python, --no-download, --no-setuptools, --no-wheel) 22/12/13 08:53:35 INFO Utils: resolved command to be run: WrappedArray(getconf, PAGESIZE) 22/12/13 08:53:36 INFO DatabricksUtils: created python virtualenv: /local_disk0/.ephemeral_nfs/cluster_libraries/python 22/12/13 08:53:36 INFO Utils: resolved command to be run: List(/databricks/python/bin/python, -c, import sys; dirs=[p for p in sys.path if 'package' in p]; print(' '.join(dirs))) 22/12/13 08:53:36 INFO Utils: resolved command to be run: List(/local_disk0/.ephemeral_nfs/cluster_libraries/python/bin/python, -c, from distutils.sysconfig import get_python_lib; print(get_python_lib())) 22/12/13 08:53:36 INFO DatabricksUtils: created sites.pth at /local_disk0/.ephemeral_nfs/cluster_libraries/python/lib/python3.9/site-packages/sites.pth 22/12/13 08:53:36 INFO ClusterWidePythonEnvManager: Registered /local_disk0/.ephemeral_nfs/cluster_libraries/python/lib/python3.9/site-packages with the WatchService sun.nio.fs.LinuxWatchService$LinuxWatchKey@4b05a9b1 22/12/13 08:53:36 INFO DriverDaemon$: Attempting to run: 'Update root virtualenv' 22/12/13 08:53:36 INFO DriverDaemon$: Finished updating /etc/environment 22/12/13 08:53:36 INFO DriverDaemon$$anon$1: Message out thread ready 22/12/13 08:53:36 INFO Server: jetty-9.4.46.v20220331; built: 2022-03-31T16:38:08.030Z; git: bc17a0369a11ecf40bb92c839b9ef0a8ac50ea18; jvm 1.8.0_345-b01 22/12/13 08:53:36 INFO AbstractConnector: Started ServerConnector@51fadd20{HTTP/1.1, (http/1.1)}{0.0.0.0:6061} 22/12/13 08:53:36 INFO Server: Started @27575ms 22/12/13 08:53:36 INFO Server: jetty-9.4.46.v20220331; built: 2022-03-31T16:38:08.030Z; git: bc17a0369a11ecf40bb92c839b9ef0a8ac50ea18; jvm 1.8.0_345-b01 22/12/13 08:53:36 INFO SslContextFactory: x509=X509@2296ed0d(1,h=[az-westeurope-c2.workers.prod.ns.databricks.com],a=[],w=[]) for Server@48306567[provider=null,keyStore=null,trustStore=null] 22/12/13 08:53:36 WARN config: Weak cipher suite TLS_ECDHE_RSA_WITH_AES_256_CBC_SHA enabled for Server@48306567[provider=null,keyStore=null,trustStore=null] 22/12/13 08:53:36 WARN config: Weak cipher suite TLS_ECDHE_ECDSA_WITH_AES_256_CBC_SHA enabled for Server@48306567[provider=null,keyStore=null,trustStore=null] 22/12/13 08:53:36 INFO AbstractConnector: Started ServerConnector@76140ca5{SSL, (ssl, http/1.1)}{0.0.0.0:6062} 22/12/13 08:53:36 INFO Server: Started @27671ms 22/12/13 08:53:36 INFO DriverDaemon: Started comm channel server 22/12/13 08:53:36 INFO DriverDaemon: Driver daemon started. 22/12/13 08:53:36 WARN DynamicInfoServiceConf: REGION environment variable is not defined. getConfForCurrentRegion will always return default value 22/12/13 08:53:36 WARN FeatureFlagRegister$$anon$1: REGION environment variable is not defined. getConfForCurrentRegion will always return default value 22/12/13 08:53:36 WARN FeatureFlagRegister$$anon$2: REGION environment variable is not defined. getConfForCurrentRegion will always return default value 22/12/13 08:53:37 WARN SQLConf: The SQL config 'spark.sql.hive.convertCTAS' has been deprecated in Spark v3.1 and may be removed in the future. 
Set 'spark.sql.legacy.createHiveTableByDefault' to false instead. 22/12/13 08:53:37 WARN SQLConf: The SQL config 'spark.sql.hive.convertCTAS' has been deprecated in Spark v3.1 and may be removed in the future. Set 'spark.sql.legacy.createHiveTableByDefault' to false instead. 22/12/13 08:53:37 INFO DriverCorral: Loading the root classloader 22/12/13 08:53:37 INFO DriverCorral: Starting sql repl ReplId-356f3-ef637-0a665-4 22/12/13 08:53:37 INFO DriverCorral: Starting sql repl ReplId-6fa5a-146c8-89180-b 22/12/13 08:53:37 INFO DriverCorral: Starting sql repl ReplId-55d45-7c702-e1fae-6 22/12/13 08:53:37 INFO SQLDriverWrapper: setupRepl:ReplId-356f3-ef637-0a665-4: finished to load 22/12/13 08:53:37 INFO SQLDriverWrapper: setupRepl:ReplId-55d45-7c702-e1fae-6: finished to load 22/12/13 08:53:37 INFO SQLDriverWrapper: setupRepl:ReplId-6fa5a-146c8-89180-b: finished to load 22/12/13 08:53:37 INFO DriverCorral: Starting sql repl ReplId-795cb-c105b-c6fb9-3 22/12/13 08:53:37 INFO SQLDriverWrapper: setupRepl:ReplId-795cb-c105b-c6fb9-3: finished to load 22/12/13 08:53:37 INFO DriverCorral: Starting sql repl ReplId-4b103-64a87-1bab1-b 22/12/13 08:53:37 INFO SQLDriverWrapper: setupRepl:ReplId-4b103-64a87-1bab1-b: finished to load 22/12/13 08:53:37 WARN SQLConf: The SQL config 'spark.sql.hive.convertCTAS' has been deprecated in Spark v3.1 and may be removed in the future. Set 'spark.sql.legacy.createHiveTableByDefault' to false instead. 22/12/13 08:53:37 WARN SQLConf: The SQL config 'spark.sql.hive.convertCTAS' has been deprecated in Spark v3.1 and may be removed in the future. Set 'spark.sql.legacy.createHiveTableByDefault' to false instead. 22/12/13 08:53:37 INFO DriverCorral: Starting r repl ReplId-5c836-91590-09603-d 22/12/13 08:53:38 INFO ROutputStreamHandler: Connection succeeded on port 33251 22/12/13 08:53:38 INFO ROutputStreamHandler: Connection succeeded on port 41623 22/12/13 08:53:38 INFO RDriverLocal: 1. RDriverLocal.2a9fb8e7-362d-4f52-929c-607208818d7e: object created with for ReplId-5c836-91590-09603-d. 22/12/13 08:53:38 INFO RDriverLocal: 2. RDriverLocal.2a9fb8e7-362d-4f52-929c-607208818d7e: initializing ... 22/12/13 08:53:38 INFO RDriverLocal: 3. RDriverLocal.2a9fb8e7-362d-4f52-929c-607208818d7e: started RBackend thread on port 45155 22/12/13 08:53:38 INFO RDriverLocal: 4. RDriverLocal.2a9fb8e7-362d-4f52-929c-607208818d7e: waiting for SparkR to be installed ... 
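The repeated SQLConf warning comes from spark.sql.hive.convertCTAS=true still being set (it appears in the configuration dump) even though spark.sql.legacy.createHiveTableByDefault=false is also present. A minimal sketch of the replacement the warning asks for, assuming the usual `spark` session object and that the deprecated key is simply dropped in favour of the legacy flag:

# The deprecated key can be cleared at runtime; the legacy flag is what Spark 3.x reads.
spark.conf.unset("spark.sql.hive.convertCTAS")
spark.conf.set("spark.sql.legacy.createHiveTableByDefault", "false")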
22/12/13 08:53:41 WARN JettyServer$Builder: Unexpected exception: com.databricks.backend.common.rpc.DriverMessages$ShouldUseAutoscalingInfo com.databricks.backend.common.rpc.DriverMessages$ShouldUseAutoscalingInfo at com.databricks.backend.daemon.driver.DriverCorral.com$databricks$backend$daemon$driver$DriverCorral$$handleRPCRequest(DriverCorral.scala:1079) at com.databricks.backend.daemon.driver.DriverCorral$$anonfun$receive$1.applyOrElse(DriverCorral.scala:1125) at com.databricks.backend.daemon.driver.DriverCorral$$anonfun$receive$1.applyOrElse(DriverCorral.scala:1123) at com.databricks.rpc.ServerBackend.$anonfun$internalReceive$2(ServerBackend.scala:120) at com.databricks.rpc.ServerBackend$$anonfun$commonReceive$1.applyOrElse(ServerBackend.scala:147) at com.databricks.rpc.ServerBackend$$anonfun$commonReceive$1.applyOrElse(ServerBackend.scala:147) at com.databricks.rpc.ServerBackend.$anonfun$internalReceive$1(ServerBackend.scala:102) at com.databricks.logging.UsageLogging.$anonfun$recordOperation$1(UsageLogging.scala:515) at com.databricks.logging.UsageLogging.executeThunkAndCaptureResultTags$1(UsageLogging.scala:609) at com.databricks.logging.UsageLogging.$anonfun$recordOperationWithResultTags$4(UsageLogging.scala:630) at com.databricks.logging.UsageLogging.$anonfun$withAttributionContext$1(UsageLogging.scala:377) at scala.util.DynamicVariable.withValue(DynamicVariable.scala:62) at com.databricks.logging.AttributionContext$.withValue(AttributionContext.scala:108) at com.databricks.logging.UsageLogging.withAttributionContext(UsageLogging.scala:375) at com.databricks.logging.UsageLogging.withAttributionContext$(UsageLogging.scala:372) at com.databricks.rpc.ServerBackend.withAttributionContext(ServerBackend.scala:24) at com.databricks.logging.UsageLogging.withAttributionTags(UsageLogging.scala:420) at com.databricks.logging.UsageLogging.withAttributionTags$(UsageLogging.scala:405) at com.databricks.rpc.ServerBackend.withAttributionTags(ServerBackend.scala:24) at com.databricks.logging.UsageLogging.recordOperationWithResultTags(UsageLogging.scala:604) at com.databricks.logging.UsageLogging.recordOperationWithResultTags$(UsageLogging.scala:524) at com.databricks.rpc.ServerBackend.recordOperationWithResultTags(ServerBackend.scala:24) at com.databricks.logging.UsageLogging.recordOperation(UsageLogging.scala:515) at com.databricks.logging.UsageLogging.recordOperation$(UsageLogging.scala:487) at com.databricks.rpc.ServerBackend.recordOperation(ServerBackend.scala:24) at com.databricks.rpc.ServerBackend.internalReceive(ServerBackend.scala:101) at com.databricks.rpc.JettyServer$RequestManager.$anonfun$handleRPC$2(JettyServer.scala:976) at scala.util.Try$.apply(Try.scala:213) at com.databricks.rpc.JettyServer$RequestManager.handleRPC(JettyServer.scala:976) at com.databricks.rpc.JettyServer$RequestManager.handleRequestAndRespond(JettyServer.scala:894) at com.databricks.rpc.JettyServer$RequestManager.$anonfun$handleHttp$3(JettyServer.scala:529) at com.databricks.rpc.JettyServer$RequestManager.$anonfun$handleHttp$3$adapted(JettyServer.scala:511) at com.databricks.logging.activity.ActivityContextFactory$.$anonfun$withActivity$1(ActivityContextFactory.scala:45) at com.databricks.logging.UsageLogging.$anonfun$withAttributionContext$1(UsageLogging.scala:377) at scala.util.DynamicVariable.withValue(DynamicVariable.scala:62) at com.databricks.logging.AttributionContext$.withValue(AttributionContext.scala:108) at com.databricks.logging.UsageLogging.withAttributionContext(UsageLogging.scala:375) at 
com.databricks.logging.UsageLogging.withAttributionContext$(UsageLogging.scala:372) at com.databricks.logging.activity.ActivityContextFactory$.withAttributionContext(ActivityContextFactory.scala:28) at com.databricks.logging.UsageLogging.withAttributionTags(UsageLogging.scala:420) at com.databricks.logging.UsageLogging.withAttributionTags$(UsageLogging.scala:405) at com.databricks.logging.activity.ActivityContextFactory$.withAttributionTags(ActivityContextFactory.scala:28) at com.databricks.logging.activity.ActivityContextFactory$.withActivity(ActivityContextFactory.scala:45) at com.databricks.rpc.JettyServer$RequestManager.$anonfun$handleHttp$2(JettyServer.scala:511) at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23) at com.databricks.logging.UsageLogging.$anonfun$withAttributionContext$1(UsageLogging.scala:377) at scala.util.DynamicVariable.withValue(DynamicVariable.scala:62) at com.databricks.logging.AttributionContext$.withValue(AttributionContext.scala:108) at com.databricks.logging.UsageLogging.withAttributionContext(UsageLogging.scala:375) at com.databricks.logging.UsageLogging.withAttributionContext$(UsageLogging.scala:372) at com.databricks.rpc.JettyServer$.withAttributionContext(JettyServer.scala:258) at com.databricks.logging.UsageLogging.withAttributionTags(UsageLogging.scala:420) at com.databricks.logging.UsageLogging.withAttributionTags$(UsageLogging.scala:405) at com.databricks.rpc.JettyServer$.withAttributionTags(JettyServer.scala:258) at com.databricks.rpc.JettyServer$RequestManager.handleHttp(JettyServer.scala:499) at com.databricks.rpc.JettyServer$RequestManager.doPost(JettyServer.scala:400) at javax.servlet.http.HttpServlet.service(HttpServlet.java:523) at com.databricks.rpc.HttpServletWithPatch.service(HttpServletWithPatch.scala:33) at javax.servlet.http.HttpServlet.service(HttpServlet.java:590) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:550) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:190) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:501) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at com.databricks.rpc.InstrumentedQueuedThreadPool$$anon$1.run(InstrumentedQueuedThreadPool.scala:80) at 
org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 22/12/13 08:53:47 INFO HikariDataSource: metastore-monitor - Shutdown completed. 22/12/13 08:53:47 WARN MetastoreMonitor: Failed to connect to the metastore InternalMysqlMetastore(DbMetastoreConfig{host=consolidated-westeuropec2-prod-metastore-3.mysql.database.azure.com, port=3306, dbName=organization5367007285973203, user=[REDACTED]}). (timeSinceLastSuccess=0) org.skife.jdbi.v2.exceptions.UnableToObtainConnectionException: java.sql.SQLTransientConnectionException: metastore-monitor - Connection is not available, request timed out after 15090ms. at org.skife.jdbi.v2.DBI.open(DBI.java:230) at com.databricks.backend.daemon.driver.MetastoreMonitor.$anonfun$checkMetastore$1(MetastoreMonitor.scala:195) at com.databricks.backend.daemon.driver.MetastoreMonitor.$anonfun$checkMetastore$1$adapted(MetastoreMonitor.scala:194) at com.databricks.common.database.DatabaseUtils$.withDBI(DatabaseUtils.scala:352) at com.databricks.backend.daemon.driver.MetastoreMonitor.checkMetastore(MetastoreMonitor.scala:194) at com.databricks.backend.daemon.driver.MetastoreMonitor.$anonfun$doMonitor$1(MetastoreMonitor.scala:158) at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23) at com.databricks.logging.UsageLogging.$anonfun$recordOperation$1(UsageLogging.scala:515) at com.databricks.logging.UsageLogging.executeThunkAndCaptureResultTags$1(UsageLogging.scala:609) at com.databricks.logging.UsageLogging.$anonfun$recordOperationWithResultTags$4(UsageLogging.scala:630) at com.databricks.logging.UsageLogging.$anonfun$withAttributionContext$1(UsageLogging.scala:377) at scala.util.DynamicVariable.withValue(DynamicVariable.scala:62) at com.databricks.logging.AttributionContext$.withValue(AttributionContext.scala:108) at com.databricks.logging.UsageLogging.withAttributionContext(UsageLogging.scala:375) at com.databricks.logging.UsageLogging.withAttributionContext$(UsageLogging.scala:372) at com.databricks.threading.NamedTimer$$anon$1.withAttributionContext(NamedTimer.scala:95) at com.databricks.logging.UsageLogging.withAttributionTags(UsageLogging.scala:420) at com.databricks.logging.UsageLogging.withAttributionTags$(UsageLogging.scala:405) at com.databricks.threading.NamedTimer$$anon$1.withAttributionTags(NamedTimer.scala:95) at com.databricks.logging.UsageLogging.recordOperationWithResultTags(UsageLogging.scala:604) at com.databricks.logging.UsageLogging.recordOperationWithResultTags$(UsageLogging.scala:524) at com.databricks.threading.NamedTimer$$anon$1.recordOperationWithResultTags(NamedTimer.scala:95) at com.databricks.logging.UsageLogging.recordOperation(UsageLogging.scala:515) at com.databricks.logging.UsageLogging.recordOperation$(UsageLogging.scala:487) at com.databricks.threading.NamedTimer$$anon$1.recordOperation(NamedTimer.scala:95) at com.databricks.threading.NamedTimer$$anon$1.$anonfun$run$2(NamedTimer.scala:104) at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23) at com.databricks.logging.UsageLogging.$anonfun$withAttributionContext$1(UsageLogging.scala:377) at scala.util.DynamicVariable.withValue(DynamicVariable.scala:62) at com.databricks.logging.AttributionContext$.withValue(AttributionContext.scala:108) at com.databricks.logging.UsageLogging.withAttributionContext(UsageLogging.scala:375) at 
com.databricks.logging.UsageLogging.withAttributionContext$(UsageLogging.scala:372) at com.databricks.threading.NamedTimer$$anon$1.withAttributionContext(NamedTimer.scala:95) at com.databricks.logging.UsageLogging.disableTracing(UsageLogging.scala:1295) at com.databricks.logging.UsageLogging.disableTracing$(UsageLogging.scala:1294) at com.databricks.threading.NamedTimer$$anon$1.disableTracing(NamedTimer.scala:95) at com.databricks.threading.NamedTimer$$anon$1.$anonfun$run$1(NamedTimer.scala:103) at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23) at com.databricks.util.UntrustedUtils$.tryLog(UntrustedUtils.scala:109) at com.databricks.threading.NamedTimer$$anon$1.run(NamedTimer.scala:102) at java.util.TimerThread.mainLoop(Timer.java:555) at java.util.TimerThread.run(Timer.java:505) Caused by: java.sql.SQLTransientConnectionException: metastore-monitor - Connection is not available, request timed out after 15090ms. at com.zaxxer.hikari.pool.HikariPool.createTimeoutException(HikariPool.java:696) at com.zaxxer.hikari.pool.HikariPool.getConnection(HikariPool.java:197) at com.zaxxer.hikari.pool.HikariPool.getConnection(HikariPool.java:162) at com.zaxxer.hikari.HikariDataSource.getConnection(HikariDataSource.java:128) at com.databricks.common.database.PassThroughHikariDataSource.getConnection(HikariDataSourceTrait.scala:52) at com.databricks.common.database.PassThroughSingleConnectionDataSource.getConnection(StandaloneDataSource.scala:67) at com.databricks.common.database.MultiDatabaseDataSource.getConnection(MultiDatabaseDataSource.scala:52) at org.skife.jdbi.v2.DataSourceConnectionFactory.openConnection(DataSourceConnectionFactory.java:36) at org.skife.jdbi.v2.DBI.open(DBI.java:212) ... 41 more 22/12/13 08:53:51 INFO DriverCorral: [Thread 134] AttachLibraries - candidate libraries: List(PythonWhlId(dbfs:/dbx/ProcessingNotebooks/448b427ee3f84456912b04527b2227d1/artifacts/dist/cicd_sample_project-0.0.1-py3-none-any.whl,,NONE)) 22/12/13 08:53:51 INFO DriverCorral: [Thread 134] AttachLibraries - new libraries to install (including resolved dependencies): List(PythonWhlId(dbfs:/dbx/ProcessingNotebooks/448b427ee3f84456912b04527b2227d1/artifacts/dist/cicd_sample_project-0.0.1-py3-none-any.whl,,NONE)) 22/12/13 08:53:52 INFO SharedDriverContext: [Thread 134] attachLibrariesToSpark PythonWhlId(dbfs:/dbx/ProcessingNotebooks/448b427ee3f84456912b04527b2227d1/artifacts/dist/cicd_sample_project-0.0.1-py3-none-any.whl,,NONE) 22/12/13 08:53:52 INFO SharedDriverContext: Attaching Python lib: dbfs:/dbx/ProcessingNotebooks/448b427ee3f84456912b04527b2227d1/artifacts/dist/cicd_sample_project-0.0.1-py3-none-any.whl to clusterwide nfs path 22/12/13 08:53:52 INFO LibraryDownloadManager: Downloading a library that was not in the cache: PythonWhlId(dbfs:/dbx/ProcessingNotebooks/448b427ee3f84456912b04527b2227d1/artifacts/dist/cicd_sample_project-0.0.1-py3-none-any.whl,,NONE) 22/12/13 08:53:52 INFO LibraryDownloadManager: Attempt 1: wait until library PythonWhlId(dbfs:/dbx/ProcessingNotebooks/448b427ee3f84456912b04527b2227d1/artifacts/dist/cicd_sample_project-0.0.1-py3-none-any.whl,,NONE) is downloaded 22/12/13 08:53:52 INFO LibraryDownloadManager: Downloaded library PythonWhlId(dbfs:/dbx/ProcessingNotebooks/448b427ee3f84456912b04527b2227d1/artifacts/dist/cicd_sample_project-0.0.1-py3-none-any.whl,,NONE) as local file /local_disk0/tmp/addedFile8546073859456008787cicd_sample_project_0_0_1_py3_none_any-6773e.whl.whl in 769 milliseconds 22/12/13 08:53:52 INFO LibraryDownloadManager: Rename local file 
/local_disk0/tmp/addedFile8546073859456008787cicd_sample_project_0_0_1_py3_none_any-6773e.whl.whl to targetName: /local_disk0/tmp/cicd_sample_project-0.0.1-py3-none-any.whl 22/12/13 08:53:52 INFO Utils: resolved command to be run: List(/local_disk0/.ephemeral_nfs/cluster_libraries/python/bin/python, -c, from distutils.sysconfig import get_python_lib; print(get_python_lib())) 22/12/13 08:53:52 INFO SharedDriverContext: Copied whl file from cicd_sample_project-0.0.1-py3-none-any.whl to cicd_sample_project-0.0.1-py3-none-any.whl 22/12/13 08:53:52 INFO Utils: resolved command to be run: List(bash, /local_disk0/.ephemeral_nfs/cluster_libraries/python/python_start_clusterwide.sh, /local_disk0/.ephemeral_nfs/cluster_libraries/python/bin/pip, install, --upgrade, --find-links=/local_disk0/.ephemeral_nfs/cluster_libraries/python/lib/python3.9/site-packages, /local_disk0/.ephemeral_nfs/cluster_libraries/python/lib/python3.9/site-packages/cicd_sample_project-0.0.1-py3-none-any.whl, --disable-pip-version-check) 22/12/13 08:53:57 INFO RDriverLocal$: SparkR installation completed. 22/12/13 08:53:57 INFO RDriverLocal: 5. RDriverLocal.2a9fb8e7-362d-4f52-929c-607208818d7e: launching R process ... 22/12/13 08:53:57 INFO RDriverLocal: 6. RDriverLocal.2a9fb8e7-362d-4f52-929c-607208818d7e: cgroup isolation disabled, not placing R process in REPL cgroup. 22/12/13 08:53:57 INFO RDriverLocal: 7. RDriverLocal.2a9fb8e7-362d-4f52-929c-607208818d7e: starting R process on port 1100 (attempt 1) ... 22/12/13 08:53:57 INFO RDriverLocal$: Debugging command for R process builder: SIMBASPARKINI=/etc/simba.sparkodbc.ini R_LIBS=/local_disk0/.ephemeral_nfs/envs/rEnv-b6546a3b-b4d0-46ba-947c-0ec2c9f26d75:/databricks/spark/R/lib:/local_disk0/.ephemeral_nfs/cluster_libraries/r LD_LIBRARY_PATH=/opt/simba/sparkodbc/lib/64/ SPARKR_BACKEND_CONNECTION_TIMEOUT=604800 DB_STREAM_BEACON_STRING_START=DATABRICKS_STREAM_START-ReplId-5c836-91590-09603-d DB_STDOUT_STREAM_PORT=33251 SPARKR_BACKEND_AUTH_SECRET=53c1bc39216e91f6a95dde9438107868d5e75058229c664cd0b7041e562bc4e6 DB_STREAM_BEACON_STRING_END=DATABRICKS_STREAM_END-ReplId-5c836-91590-09603-d EXISTING_SPARKR_BACKEND_PORT=45155 ODBCINI=/etc/odbc.ini DB_STDERR_STREAM_PORT=41623 /bin/bash /local_disk0/tmp/_startR.sh8583071710255904343resource.r /local_disk0/tmp/_rServeScript.r7751878714786184989resource.r 1100 None 22/12/13 08:53:57 INFO RDriverLocal: 8. RDriverLocal.2a9fb8e7-362d-4f52-929c-607208818d7e: setting up BufferedStreamThread with bufferSize: 1000. 22/12/13 08:53:58 INFO SharedDriverContext: Successfully attached library dbfs:/dbx/ProcessingNotebooks/448b427ee3f84456912b04527b2227d1/artifacts/dist/cicd_sample_project-0.0.1-py3-none-any.whl to Spark 22/12/13 08:53:58 INFO LibraryState: [Thread 134] Successfully attached library dbfs:/dbx/ProcessingNotebooks/448b427ee3f84456912b04527b2227d1/artifacts/dist/cicd_sample_project-0.0.1-py3-none-any.whl 22/12/13 08:53:58 INFO DriverCorral: [Thread 134] AttachLibraries - candidate libraries: List(PythonWhlId(dbfs:/dbx/ProcessingNotebooks/448b427ee3f84456912b04527b2227d1/artifacts/dist/cicd_sample_project-0.0.1-py3-none-any.whl,,NONE)) 22/12/13 08:53:58 INFO DriverCorral: [Thread 134] AttachLibraries - new libraries to install (including resolved dependencies): List() 22/12/13 08:53:59 INFO RDriverLocal: 9. RDriverLocal.2a9fb8e7-362d-4f52-929c-607208818d7e: R process started with RServe listening on port 1100. 22/12/13 08:53:59 INFO RDriverLocal: 10. 
RDriverLocal.2a9fb8e7-362d-4f52-929c-607208818d7e: starting interpreter to talk to R process ... 22/12/13 08:54:00 WARN SparkContext: Using an existing SparkContext; some configuration may not take effect. 22/12/13 08:54:00 INFO ROutputStreamHandler: Successfully connected to stdout in the RShell. 22/12/13 08:54:00 INFO ROutputStreamHandler: Successfully connected to stderr in the RShell. 22/12/13 08:54:00 INFO RDriverLocal: 11. RDriverLocal.2a9fb8e7-362d-4f52-929c-607208818d7e: R interpreter is connected. 22/12/13 08:54:00 INFO RDriverWrapper: setupRepl:ReplId-5c836-91590-09603-d: finished to load 22/12/13 08:54:22 WARN SQLConf: The SQL config 'spark.sql.hive.convertCTAS' has been deprecated in Spark v3.1 and may be removed in the future. Set 'spark.sql.legacy.createHiveTableByDefault' to false instead. 22/12/13 08:54:22 WARN SQLConf: The SQL config 'spark.sql.hive.convertCTAS' has been deprecated in Spark v3.1 and may be removed in the future. Set 'spark.sql.legacy.createHiveTableByDefault' to false instead. 22/12/13 08:54:22 INFO DriverCorral: Starting python repl ReplId-61c9b-68060-de6de-a 22/12/13 08:54:22 INFO ClusterLoadMonitor: Added query with execution ID:0. Current active queries:1 22/12/13 08:54:22 INFO LogicalPlanStats: Setting LogicalPlanStats visitor to com.databricks.sql.optimizer.statsEstimation.DatabricksLogicalPlanStatsVisitor$ 22/12/13 08:54:23 INFO ClusterLoadAvgHelper: Current cluster load: 1, Old Ema: 0.0, New Ema: 1.0 22/12/13 08:54:24 INFO SecuredHiveExternalCatalog: creating hiveClient from java.lang.Throwable at org.apache.spark.sql.hive.HiveExternalCatalog.client$lzycompute(HiveExternalCatalog.scala:79) at org.apache.spark.sql.hive.HiveExternalCatalog.client(HiveExternalCatalog.scala:77) at org.apache.spark.sql.hive.HiveExternalCatalog.maybeSynchronized(HiveExternalCatalog.scala:113) at org.apache.spark.sql.hive.HiveExternalCatalog.$anonfun$withClient$1(HiveExternalCatalog.scala:153) at com.databricks.backend.daemon.driver.ProgressReporter$.withStatusCode(ProgressReporter.scala:364) at com.databricks.spark.util.SparkDatabricksProgressReporter$.withStatusCode(ProgressReporter.scala:34) at org.apache.spark.sql.hive.HiveExternalCatalog.withClient(HiveExternalCatalog.scala:152) at org.apache.spark.sql.hive.HiveExternalCatalog.databaseExists(HiveExternalCatalog.scala:313) at org.apache.spark.sql.internal.SharedState.externalCatalog$lzycompute(SharedState.scala:261) at org.apache.spark.sql.internal.SharedState.externalCatalog(SharedState.scala:251) at org.apache.spark.sql.hive.HiveSessionStateBuilder.externalCatalog(HiveSessionStateBuilder.scala:60) at org.apache.spark.sql.hive.HiveSessionStateBuilder.$anonfun$resourceLoader$1(HiveSessionStateBuilder.scala:67) at org.apache.spark.sql.hive.HiveSessionResourceLoader.client$lzycompute(HiveSessionStateBuilder.scala:165) at org.apache.spark.sql.hive.HiveSessionResourceLoader.client(HiveSessionStateBuilder.scala:165) at org.apache.spark.sql.hive.HiveSessionResourceLoader.$anonfun$addJar$1(HiveSessionStateBuilder.scala:169) at org.apache.spark.sql.hive.HiveSessionResourceLoader.$anonfun$addJar$1$adapted(HiveSessionStateBuilder.scala:168) at scala.collection.immutable.List.foreach(List.scala:431) at org.apache.spark.sql.hive.HiveSessionResourceLoader.addJar(HiveSessionStateBuilder.scala:168) at org.apache.spark.sql.execution.command.AddJarsCommand.$anonfun$run$1(resources.scala:38) at org.apache.spark.sql.execution.command.AddJarsCommand.$anonfun$run$1$adapted(resources.scala:36) at 
scala.collection.immutable.Stream.foreach(Stream.scala:533) at org.apache.spark.sql.execution.command.AddJarsCommand.run(resources.scala:36) at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult$lzycompute(commands.scala:80) at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult(commands.scala:78) at org.apache.spark.sql.execution.command.ExecutedCommandExec.executeCollect(commands.scala:89) at org.apache.spark.sql.execution.QueryExecution$$anonfun$$nestedInanonfun$eagerlyExecuteCommands$1$1.$anonfun$applyOrElse$2(QueryExecution.scala:241) at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withCustomExecutionEnv$8(SQLExecution.scala:241) at org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:390) at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withCustomExecutionEnv$1(SQLExecution.scala:187) at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:985) at org.apache.spark.sql.execution.SQLExecution$.withCustomExecutionEnv(SQLExecution.scala:142) at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:340) at org.apache.spark.sql.execution.QueryExecution$$anonfun$$nestedInanonfun$eagerlyExecuteCommands$1$1.$anonfun$applyOrElse$1(QueryExecution.scala:241) at org.apache.spark.sql.execution.QueryExecution.org$apache$spark$sql$execution$QueryExecution$$withMVTagsIfNecessary(QueryExecution.scala:226) at org.apache.spark.sql.execution.QueryExecution$$anonfun$$nestedInanonfun$eagerlyExecuteCommands$1$1.applyOrElse(QueryExecution.scala:239) at org.apache.spark.sql.execution.QueryExecution$$anonfun$$nestedInanonfun$eagerlyExecuteCommands$1$1.applyOrElse(QueryExecution.scala:232) at org.apache.spark.sql.catalyst.trees.TreeNode.$anonfun$transformDownWithPruning$1(TreeNode.scala:512) at org.apache.spark.sql.catalyst.trees.CurrentOrigin$.withOrigin(TreeNode.scala:99) at org.apache.spark.sql.catalyst.trees.TreeNode.transformDownWithPruning(TreeNode.scala:512) at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.org$apache$spark$sql$catalyst$plans$logical$AnalysisHelper$$super$transformDownWithPruning(LogicalPlan.scala:31) at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.transformDownWithPruning(AnalysisHelper.scala:268) at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.transformDownWithPruning$(AnalysisHelper.scala:264) at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.transformDownWithPruning(LogicalPlan.scala:31) at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.transformDownWithPruning(LogicalPlan.scala:31) at org.apache.spark.sql.catalyst.trees.TreeNode.transformDown(TreeNode.scala:488) at org.apache.spark.sql.execution.QueryExecution.$anonfun$eagerlyExecuteCommands$1(QueryExecution.scala:232) at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper$.allowInvokingTransformsInAnalyzer(AnalysisHelper.scala:324) at org.apache.spark.sql.execution.QueryExecution.eagerlyExecuteCommands(QueryExecution.scala:232) at org.apache.spark.sql.execution.QueryExecution.commandExecuted$lzycompute(QueryExecution.scala:186) at org.apache.spark.sql.execution.QueryExecution.commandExecuted(QueryExecution.scala:177) at org.apache.spark.sql.Dataset.<init>(Dataset.scala:237) at org.apache.spark.sql.Dataset$.$anonfun$ofRows$2(Dataset.scala:106) at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:985) at org.apache.spark.sql.Dataset$.ofRows(Dataset.scala:103) at
org.apache.spark.sql.SparkSession.$anonfun$sql$1(SparkSession.scala:820) at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:985) at org.apache.spark.sql.SparkSession.sql(SparkSession.scala:815) at org.apache.spark.sql.SQLContext.sql(SQLContext.scala:695) at com.databricks.backend.daemon.driver.DriverLocal.$anonfun$new$5(DriverLocal.scala:291) at com.databricks.sql.acl.CheckPermissions$.trusted(CheckPermissions.scala:1699) at com.databricks.backend.daemon.driver.DriverLocal.$anonfun$new$4(DriverLocal.scala:291) at com.databricks.unity.EmptyHandle$.runWith(UCSHandle.scala:41) at com.databricks.backend.daemon.driver.DriverLocal.$anonfun$new$3(DriverLocal.scala:285) at scala.util.Using$.resource(Using.scala:269) at com.databricks.backend.daemon.driver.DriverLocal.$anonfun$new$2(DriverLocal.scala:284) at scala.collection.Iterator.foreach(Iterator.scala:943) at scala.collection.Iterator.foreach$(Iterator.scala:943) at scala.collection.AbstractIterator.foreach(Iterator.scala:1431) at scala.collection.IterableLike.foreach(IterableLike.scala:74) at scala.collection.IterableLike.foreach$(IterableLike.scala:73) at scala.collection.AbstractIterable.foreach(Iterable.scala:56) at com.databricks.backend.daemon.driver.DriverLocal.<init>(DriverLocal.scala:271) at com.databricks.backend.daemon.driver.PythonDriverLocalBase.<init>(PythonDriverLocalBase.scala:169) at com.databricks.backend.daemon.driver.JupyterDriverLocal.<init>(JupyterDriverLocal.scala:369) at com.databricks.backend.daemon.driver.PythonDriverWrapper.instantiateDriver(DriverWrapper.scala:712) at com.databricks.backend.daemon.driver.DriverWrapper.setupRepl(DriverWrapper.scala:342) at com.databricks.backend.daemon.driver.DriverWrapper.run(DriverWrapper.scala:231) at java.lang.Thread.run(Thread.java:750) 22/12/13 08:54:24 INFO HiveConf: Found configuration file file:/databricks/hive/conf/hive-site.xml 22/12/13 08:54:24 WARN SQLConf: The SQL config 'spark.sql.hive.convertCTAS' has been deprecated in Spark v3.1 and may be removed in the future. Set 'spark.sql.legacy.createHiveTableByDefault' to false instead.
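The SQLConf warning above spells out the migration path: the deprecated 'spark.sql.hive.convertCTAS' key should give way to 'spark.sql.legacy.createHiveTableByDefault'. A minimal sketch of applying that replacement on a Spark 3.3 driver follows; the application name is illustrative and nothing here is taken from this cluster's actual configuration.

    // Hypothetical sketch: setting the replacement key recommended by the SQLConf warning.
    import org.apache.spark.sql.SparkSession

    val spark = SparkSession.builder()
      .appName("ctas-config-migration-sketch")                      // illustrative name, not this cluster's app
      .config("spark.sql.legacy.createHiveTableByDefault", "false") // replaces spark.sql.hive.convertCTAS
      .getOrCreate()

    // The same key can also be changed on an already-running session:
    spark.conf.set("spark.sql.legacy.createHiveTableByDefault", "false")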
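Earlier in this log the metastore-monitor thread gave up with a SQLTransientConnectionException after roughly 15 seconds, which is HikariCP's pool-acquisition timeout surfacing through JDBI. A small, self-contained sketch of how that failure mode presents is below; the JDBC URL, credentials, and timeout are placeholders, not values from this workspace's metastore configuration.

    // Hypothetical sketch of a HikariCP pool whose acquisition timeout produces the
    // "Connection is not available, request timed out" error seen in the log above.
    import com.zaxxer.hikari.{HikariConfig, HikariDataSource}
    import java.sql.SQLTransientConnectionException

    val cfg = new HikariConfig()
    cfg.setPoolName("metastore-monitor")
    cfg.setJdbcUrl("jdbc:mariadb://example-host:3306/example_db") // placeholder, not the real metastore
    cfg.setUsername("example_user")                               // placeholder
    cfg.setPassword("example_password")                           // placeholder
    cfg.setConnectionTimeout(15000L)                              // ~15 s, comparable to the logged 15090 ms

    val pool = new HikariDataSource(cfg)
    try {
      val conn = pool.getConnection() // throws if no connection is free within connectionTimeout
      try {
        // a health-check query would run here
      } finally conn.close()
    } catch {
      case e: SQLTransientConnectionException =>
        // JDBI wraps this as UnableToObtainConnectionException, as shown in the stack trace above.
        println(s"metastore check failed: ${e.getMessage}")
    } finally pool.close()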
22/12/13 08:54:24 INFO HiveUtils: Initializing HiveMetastoreConnection version 0.13.0 using file:/databricks/databricks-hive/----ws_3_3--maven-trees--hive-metastore-databricks-log4j2--commons-lang--commons-lang--commons-lang__commons-lang__2.4.jar:file:/databricks/databricks-hive/----ws_3_3--maven-trees--hive-metastore-databricks-log4j2--org.mortbay.jetty--jetty--org.mortbay.jetty__jetty__6.1.26.jar:file:/databricks/databricks-hive/----ws_3_3--maven-trees--hive-metastore-databricks-log4j2--org.objenesis--objenesis--org.objenesis__objenesis__1.2.jar:file:/databricks/databricks-hive/----ws_3_3--maven-trees--hive-metastore-databricks-log4j2--org.apache.commons--commons-lang3--org.apache.commons__commons-lang3__3.4.jar:file:/databricks/databricks-hive/----ws_3_3--maven-trees--hive-metastore-databricks-log4j2--com.esotericsoftware.kryo--kryo--com.esotericsoftware.kryo__kryo__2.21.jar:file:/databricks/databricks-hive/----ws_3_3--maven-trees--hive-metastore-databricks-log4j2--org.apache.hive--hive-serde--org.apache.hive__hive-serde__0.13.1-databricks-8.jar:file:/databricks/databricks-hive/----ws_3_3--maven-trees--hive-metastore-databricks-log4j2--org.apache.avro--avro--org.apache.avro__avro__1.7.5.jar:file:/databricks/databricks-hive/----ws_3_3--maven-trees--hive-metastore-databricks-log4j2--javax.servlet--servlet-api--javax.servlet__servlet-api__2.5.jar:file:/databricks/databricks-hive/----ws_3_3--maven-trees--hive-metastore-databricks-log4j2--org.antlr--antlr-runtime--org.antlr__antlr-runtime__3.4.jar:file:/databricks/databricks-hive/----ws_3_3--maven-trees--hive-metastore-databricks-log4j2--org.apache.hive--hive-metastore--org.apache.hive__hive-metastore__0.13.1-databricks-8.jar:file:/databricks/databricks-hive/----ws_3_3--maven-trees--hive-metastore-databricks-log4j2--org.datanucleus--datanucleus-rdbms--org.datanucleus__datanucleus-rdbms__4.1.19.jar:file:/databricks/databricks-hive/----ws_3_3--maven-trees--hive-metastore-databricks-log4j2--org.apache.hive.shims--hive-shims-0.20--org.apache.hive.shims__hive-shims-0.20__0.13.1-databricks-8.jar:file:/databricks/databricks-hive/----ws_3_3--maven-trees--hive-metastore-databricks-log4j2--com.esotericsoftware.reflectasm--reflectasm-shaded--com.esotericsoftware.reflectasm__reflectasm-shaded__1.07.jar:file:/databricks/databricks-hive/----ws_3_3--maven-trees--hive-metastore-databricks-log4j2--commons-cli--commons-cli--commons-cli__commons-cli__1.2.jar:file:/databricks/databricks-hive/----ws_3_3--maven-trees--hive-metastore-databricks-log4j2--org.codehaus.groovy--groovy-all--org.codehaus.groovy__groovy-all__2.1.6.jar:file:/databricks/databricks-hive/----ws_3_3--maven-trees--hive-metastore-databricks-log4j2--org.apache.httpcomponents--httpclient--org.apache.httpcomponents__httpclient__4.4.1.jar:file:/databricks/databricks-hive/----ws_3_3--maven-trees--hive-metastore-databricks-log4j2--org.apache.hive--hive-exec--org.apache.hive__hive-exec__0.13.1-databricks-8.jar:file:/databricks/databricks-hive/----ws_3_3--maven-trees--hive-metastore-databricks-log4j2--org.apache.derby--derby--org.apache.derby__derby__10.10.1.1.jar:file:/databricks/databricks-hive/----ws_3_3--maven-trees--hive-metastore-databricks-log4j2--com.thoughtworks.paranamer--paranamer--com.thoughtworks.paranamer__paranamer__2.8.jar:file:/databricks/databricks-hive/----ws_3_3--maven-trees--hive-metastore-databricks-log4j2--org.datanucleus--datanucleus-core--org.datanucleus__datanucleus-core__4.1.17.jar:file:/databricks/databricks-hive/----ws_3_3--maven-trees--hive-metastore-databricks-log4j2--org.
apache.hive--hive-beeline--org.apache.hive__hive-beeline__0.13.1-databricks-8.jar:file:/databricks/databricks-hive/----ws_3_3--mvn--hadoop3--org.apache.logging.log4j--log4j-api--org.apache.logging.log4j__log4j-api__2.18.0.jar:file:/databricks/databricks-hive/----ws_3_3--maven-trees--hive-metastore-databricks-log4j2--commons-io--commons-io--commons-io__commons-io__2.5.jar:file:/databricks/databricks-hive/----ws_3_3--maven-trees--hive-metastore-databricks-log4j2--org.apache.zookeeper--zookeeper--org.apache.zookeeper__zookeeper__3.4.5.jar:file:/databricks/databricks-hive/----ws_3_3--maven-trees--hive-metastore-databricks-log4j2--com.google.code.findbugs--jsr305--com.google.code.findbugs__jsr305__1.3.9.jar:file:/databricks/databricks-hive/----ws_3_3--maven-trees--hive-metastore-databricks-log4j2--javax.activation--activation--javax.activation__activation__1.1.jar:file:/databricks/databricks-hive/----ws_3_3--maven-trees--hive-metastore-databricks-log4j2--org.apache.geronimo.specs--geronimo-jaspic_1.0_spec--org.apache.geronimo.specs__geronimo-jaspic_1.0_spec__1.0.jar:file:/databricks/databricks-hive/----ws_3_3--maven-trees--hive-metastore-databricks-log4j2--org.apache.commons--commons-compress--org.apache.commons__commons-compress__1.9.jar:file:/databricks/databricks-hive/----ws_3_3--maven-trees--hive-metastore-databricks-log4j2--org.datanucleus--javax.jdo--org.datanucleus__javax.jdo__3.2.0-m3.jar:file:/databricks/databricks-hive/----ws_3_3--mvn--hadoop3--org.apache.logging.log4j--log4j-1.2-api--org.apache.logging.log4j__log4j-1.2-api__2.18.0.jar:file:/databricks/databricks-hive/----ws_3_3--maven-trees--hive-metastore-databricks-log4j2--org.antlr--stringtemplate--org.antlr__stringtemplate__3.2.1.jar:file:/databricks/databricks-hive/----ws_3_3--mvn--hadoop3--org.slf4j--slf4j-api--org.slf4j__slf4j-api__1.7.36.jar:file:/databricks/databricks-hive/----ws_3_3--maven-trees--hive-metastore-databricks-log4j2--org.apache.hive--hive-ant--org.apache.hive__hive-ant__0.13.1-databricks-8.jar:file:/databricks/databricks-hive/----ws_3_3--mvn--hadoop3--org.apache.logging.log4j--log4j-slf4j-impl--org.apache.logging.log4j__log4j-slf4j-impl__2.18.0.jar:file:/databricks/databricks-hive/----ws_3_3--maven-trees--hive-metastore-databricks-log4j2--commons-collections--commons-collections--commons-collections__commons-collections__3.2.2.jar:file:/databricks/databricks-hive/----ws_3_3--mvn--hadoop3--org.apache.logging.log4j--log4j-core--org.apache.logging.log4j__log4j-core__2.18.0.jar:file:/databricks/databricks-hive/----ws_3_3--maven-trees--hive-metastore-databricks-log4j2--org.xerial.snappy--snappy-java--org.xerial.snappy__snappy-java__1.0.5.jar:file:/databricks/databricks-hive/----ws_3_3--maven-trees--hive-metastore-databricks-log4j2--commons-httpclient--commons-httpclient--commons-httpclient__commons-httpclient__3.0.1.jar:file:/databricks/databricks-hive/----ws_3_3--maven-trees--hive-metastore-databricks-log4j2--org.apache.hive--hive-cli--org.apache.hive__hive-cli__0.13.1-databricks-8.jar:file:/databricks/databricks-hive/----ws_3_3--maven-trees--hive-metastore-databricks-log4j2--javax.mail--mail--javax.mail__mail__1.4.1.jar:file:/databricks/databricks-hive/----ws_3_3--maven-trees--hive-metastore-databricks-log4j2--stax--stax-api--stax__stax-api__1.0.1.jar:file:/databricks/databricks-hive/----ws_3_3--maven-trees--hive-metastore-databricks-log4j2--jline--jline--jline__jline__0.9.94.jar:file:/databricks/databricks-hive/----ws_3_3--maven-trees--hive-metastore-databricks-log4j2--com.jolbox--bonecp--com.jolbox__bonecp__0.8.0
.RELEASE.jar:file:/databricks/databricks-hive/----ws_3_3--maven-trees--hive-metastore-databricks-log4j2--org.datanucleus--datanucleus-api-jdo--org.datanucleus__datanucleus-api-jdo__4.2.4.jar:file:/databricks/databricks-hive/----ws_3_3--maven-trees--hive-metastore-databricks-log4j2--org.apache.geronimo.specs--geronimo-jta_1.1_spec--org.apache.geronimo.specs__geronimo-jta_1.1_spec__1.1.1.jar:file:/databricks/databricks-hive/----ws_3_3--maven-trees--hive-metastore-databricks-log4j2--org.eclipse.jetty.aggregate--jetty-all--org.eclipse.jetty.aggregate__jetty-all__7.6.0.v20120127.jar:file:/databricks/databricks-hive/----ws_3_3--maven-trees--hive-metastore-databricks-log4j2--com.esotericsoftware.minlog--minlog--com.esotericsoftware.minlog__minlog__1.2.jar:file:/databricks/databricks-hive/----ws_3_3--maven-trees--hive-metastore-databricks-log4j2--asm--asm-tree--asm__asm-tree__3.1.jar:file:/databricks/databricks-hive/----ws_3_3--maven-trees--hive-metastore-databricks-log4j2--org.apache.hive.shims--hive-shims-common-secure--org.apache.hive.shims__hive-shims-common-secure__0.13.1-databricks-8.jar:file:/databricks/databricks-hive/----ws_3_3--maven-trees--hive-metastore-databricks-log4j2--org.apache.thrift--libthrift--org.apache.thrift__libthrift__0.9.2.jar:file:/databricks/databricks-hive/----ws_3_3--maven-trees--hive-metastore-databricks-log4j2--asm--asm--asm__asm__3.1.jar:file:/databricks/databricks-hive/manifest.jar:file:/databricks/databricks-hive/----ws_3_3--maven-trees--hive-metastore-databricks-log4j2--commons-logging--commons-logging--commons-logging__commons-logging__1.1.3.jar:file:/databricks/databricks-hive/----ws_3_3--maven-trees--hive-metastore-databricks-log4j2--oro--oro--oro__oro__2.0.8.jar:file:/databricks/databricks-hive/----ws_3_3--maven-trees--hive-metastore-databricks-log4j2--org.apache.geronimo.specs--geronimo-annotation_1.0_spec--org.apache.geronimo.specs__geronimo-annotation_1.0_spec__1.1.1.jar:file:/databricks/databricks-hive/----ws_3_3--maven-trees--hive-metastore-databricks-log4j2--org.apache.velocity--velocity--org.apache.velocity__velocity__1.5.jar:file:/databricks/databricks-hive/----ws_3_3--maven-trees--hive-metastore-databricks-log4j2--com.google.guava--guava--com.google.guava__guava__11.0.2.jar:file:/databricks/databricks-hive/----ws_3_3--maven-trees--hive-metastore-databricks-log4j2--javax.jdo--jdo-api--javax.jdo__jdo-api__3.0.1.jar:file:/databricks/databricks-hive/----ws_3_3--maven-trees--hive-metastore-databricks-log4j2--org.codehaus.jackson--jackson-core-asl--org.codehaus.jackson__jackson-core-asl__1.9.13.jar:file:/databricks/databricks-hive/----ws_3_3--maven-trees--hive-metastore-databricks-log4j2--junit--junit--junit__junit__3.8.1.jar:file:/databricks/databricks-hive/----ws_3_3--maven-trees--hive-metastore-databricks-log4j2--org.apache.hive.shims--hive-shims-common--org.apache.hive.shims__hive-shims-common__0.13.1-databricks-8.jar:file:/databricks/databricks-hive/----ws_3_3--maven-trees--hive-metastore-databricks-log4j2--antlr--antlr--antlr__antlr__2.7.7.jar:file:/databricks/databricks-hive/----ws_3_3--maven-trees--hive-metastore-databricks-log4j2--org.apache.hive.shims--hive-shims-0.23--org.apache.hive.shims__hive-shims-0.23__0.13.1-databricks-8.jar:file:/databricks/databricks-hive/----ws_3_3--maven-trees--hive-metastore-databricks-log4j2--org.ow2.asm--asm--org.ow2.asm__asm__4.0.jar:file:/databricks/databricks-hive/----ws_3_3--maven-trees--hive-metastore-databricks-log4j2--org.apache.hive.shims--hive-shims-0.20S--org.apache.hive.shims__hive-shims-0.20S__0.13.1-dat
abricks-8.jar:file:/databricks/databricks-hive/----ws_3_3--maven-trees--hive-metastore-databricks-log4j2--org.mortbay.jetty--jetty-util--org.mortbay.jetty__jetty-util__6.1.26.jar:file:/databricks/databricks-hive/----ws_3_3--maven-trees--hive-metastore-databricks-log4j2--com.zaxxer--HikariCP--com.zaxxer__HikariCP__2.5.1.jar:file:/databricks/databricks-hive/----ws_3_3--maven-trees--hive-metastore-databricks-log4j2--org.apache.thrift--libfb303--org.apache.thrift__libfb303__0.9.0.jar:file:/databricks/databricks-hive/----ws_3_3--maven-trees--hive-metastore-databricks-log4j2--org.mortbay.jetty--servlet-api--org.mortbay.jetty__servlet-api__2.5-20081211.jar:file:/databricks/databricks-hive/----ws_3_3--maven-trees--hive-metastore-databricks-log4j2--asm--asm-commons--asm__asm-commons__3.1.jar:file:/databricks/databricks-hive/----ws_3_3--maven-trees--hive-metastore-databricks-log4j2--org.apache.hive--hive-common--org.apache.hive__hive-common__0.13.1-databricks-8.jar:file:/databricks/databricks-hive/----ws_3_3--maven-trees--hive-metastore-databricks-log4j2--javax.transaction--jta--javax.transaction__jta__1.1.jar:file:/databricks/databricks-hive/----ws_3_3--maven-trees--hive-metastore-databricks-log4j2--commons-codec--commons-codec--commons-codec__commons-codec__1.8.jar:file:/databricks/databricks-hive/----ws_3_3--maven-trees--hive-metastore-databricks-log4j2--org.codehaus.jackson--jackson-mapper-asl--org.codehaus.jackson__jackson-mapper-asl__1.9.13.jar:file:/databricks/databricks-hive/----ws_3_3--maven-trees--hive-metastore-databricks-log4j2--org.apache.hive--hive-jdbc--org.apache.hive__hive-jdbc__0.13.1-databricks-8.jar:file:/databricks/databricks-hive/----ws_3_3--maven-trees--hive-metastore-databricks-log4j2--org.apache.hive--hive-service--org.apache.hive__hive-service__0.13.1-databricks-8.jar:file:/databricks/databricks-hive/----ws_3_3--maven-trees--hive-metastore-databricks-log4j2--org.apache.hive--hive-shims--org.apache.hive__hive-shims__0.13.1-databricks-8.jar:file:/databricks/databricks-hive/----ws_3_3--maven-trees--hive-metastore-databricks-log4j2--org.apache.ant--ant--org.apache.ant__ant__1.9.1.jar:file:/databricks/databricks-hive/----ws_3_3--maven-trees--hive-metastore-databricks-log4j2--org.apache.httpcomponents--httpcore--org.apache.httpcomponents__httpcore__4.2.5.jar:file:/databricks/databricks-hive/----ws_3_3--maven-trees--hive-metastore-databricks-log4j2--org.antlr--ST4--org.antlr__ST4__4.0.4.jar:file:/databricks/databricks-hive/----ws_3_3--maven-trees--hive-metastore-databricks-log4j2--net.sf.jpam--jpam--net.sf.jpam__jpam__1.1.jar:file:/databricks/databricks-hive/----ws_3_3--maven-trees--hive-metastore-databricks-log4j2--org.apache.ant--ant-launcher--org.apache.ant__ant-launcher__1.9.1.jar:file:/databricks/databricks-hive/----ws_3_3--maven-trees--hive-metastore-databricks-log4j2--javax.transaction--transaction-api--javax.transaction__transaction-api__1.1.jar:file:/databricks/databricks-hive/bonecp-configs.jar 22/12/13 08:54:24 INFO PoolingHiveClient: Hive metastore connection pool implementation is HikariCP 22/12/13 08:54:24 INFO LocalHiveClientsPool: Create Hive Metastore client pool of size 1 22/12/13 08:54:24 INFO HiveClientImpl: Warehouse location for Hive client (version 0.13.1) is dbfs:/user/hive/warehouse 22/12/13 08:54:24 INFO HiveMetaStore: 0: Opening raw store with implemenation class:org.apache.hadoop.hive.metastore.ObjectStore 22/12/13 08:54:25 INFO ObjectStore: ObjectStore, initialize called 22/12/13 08:54:25 INFO Persistence: Property datanucleus.fixedDatastore unknown - 
will be ignored 22/12/13 08:54:25 INFO Persistence: Property datanucleus.connectionPool.idleTimeout unknown - will be ignored 22/12/13 08:54:25 INFO Persistence: Property hive.metastore.integral.jdo.pushdown unknown - will be ignored 22/12/13 08:54:25 INFO Persistence: Property datanucleus.cache.level2 unknown - will be ignored 22/12/13 08:54:25 INFO HikariDataSource: HikariPool-1 - Started. 22/12/13 08:54:26 INFO ClusterLoadAvgHelper: Current cluster load: 1, Old Ema: 1.0, New Ema: 1.0
[... the ClusterLoadAvgHelper entry above repeats unchanged (cluster load 1, Old Ema 1.0, New Ema 1.0) every ~3 seconds from 08:54:26 through 08:58:32 ...]
22/12/13 08:58:34 INFO DriverCorral: DBFS health check ok
22/12/13 08:58:35 INFO ClusterLoadAvgHelper: Current cluster load: 1, Old Ema: 1.0, New Ema: 1.0
[... repeats unchanged every ~3 seconds through 22/12/13 08:59:23, the last entry in this excerpt ...]
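The ClusterLoadAvgHelper entries above track cluster load as an exponential moving average, printing the old and new values on each tick. A small illustrative sketch of that kind of update follows; the smoothing factor is an assumption, since the log shows only the resulting averages.

    // Illustrative EMA update; `alpha` is an assumed smoothing factor, not a value from this log.
    def updateEma(oldEma: Double, currentLoad: Double, alpha: Double = 0.5): Double =
      alpha * currentLoad + (1.0 - alpha) * oldEma

    // With a constant load of one active query, the average settles at 1.0 and never moves,
    // which is why every tick above prints "Old Ema: 1.0, New Ema: 1.0".
    println(updateEma(oldEma = 1.0, currentLoad = 1.0)) // 1.0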