
spark-monitoring's Introduction

Monitoring Azure Databricks in an Azure Log Analytics Workspace

This branch of the library supports Azure Databricks Runtimes 10.x (Spark 3.2.x) and earlier (see Supported configurations).
Databricks has contributed an updated version to support Azure Databricks Runtimes 11.0 (Spark 3.3.x) and above on the l4jv2 branch at: https://github.com/mspnp/spark-monitoring/tree/l4jv2.
Be sure to use the correct branch and version for your Databricks Runtime.
⚠️ This library and GitHub repository are in maintenance mode. There are no plans for further releases, and issue support will be best-effort only. For any additional questions regarding this library or the roadmap for monitoring and logging of your Azure Databricks environments, please contact [email protected].

This repository extends the core monitoring functionality of Azure Databricks to send streaming query event information to Azure Monitor. For more information about using this library to monitor Azure Databricks, see Monitoring Azure Databricks.

The project has the following directory structure:

/src
    /spark-listeners-loganalytics
    /spark-listeners
    /pom.xml
/sample
    /spark-sample-job
/perftools

The spark-listeners-loganalytics and spark-listeners directories contain the code for building the two JAR files that are deployed to the Databricks cluster. The spark-listeners directory includes a scripts directory that contains a cluster node initialization script to copy the JAR files from a staging directory in the Azure Databricks file system to execution nodes. The pom.xml file is the main Maven project object model build file for the entire project.

The spark-sample-job directory is a sample Spark application demonstrating how to implement a Spark application metric counter.

The perftools directory contains details on how to use Azure Monitor with Grafana to monitor Spark performance.

Prerequisites

Before you begin, ensure you have the following prerequisites in place:

Supported configurations

Databricks Runtime(s)   Maven Profile
7.3 LTS                 scala-2.12_spark-3.0.1
9.1 LTS                 scala-2.12_spark-3.1.2
10.3 - 10.5             scala-2.12_spark-3.2.1
11.0 and above          see https://github.com/mspnp/spark-monitoring/tree/l4jv2

Logging Event Size Limit

This library currently has a size limit of 25 MB per event, based on the Log Analytics limit of 30 MB per API call, with additional overhead for formatting. The default behavior when this limit is hit is to throw an exception; this can be changed by setting the value of EXCEPTION_ON_FAILED_SEND in GenericSendBuffer.java to false.
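
The change is a one-line edit. A sketch of what it looks like, assuming the flag is a plain boolean constant (check GenericSendBuffer.java in src/spark-listeners for the actual declaration):

// GenericSendBuffer.java (sketch; the actual modifier and default may differ)
public static final boolean EXCEPTION_ON_FAILED_SEND = false; // true (the default) throws on oversized events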

Note: If your workload generates logging messages larger than 25 MB, you will see an error like java.lang.RuntimeException: Failed to schedule batch because first message size nnn exceeds batch size limit 26214400 (bytes). in the Spark logs, and your workload may not proceed. You can query Log Analytics for this error condition with:

SparkLoggingEvent_CL
| where TimeGenerated > ago(24h)
| where Message contains "java.lang.RuntimeException: Failed to schedule batch because first message size"

Build the Azure Databricks monitoring library

You can build the library using either Docker or Maven. All commands are intended to be run from the base directory of the repository.

The JAR files produced are:

spark-listeners_<Spark Version>_<Scala Version>-<Version>.jar - the generic implementation of the Spark listener framework, which collects data from the running cluster for forwarding to another logging system.

spark-listeners-loganalytics_<Spark Version>_<Scala Version>-<Version>.jar - the specific implementation, which extends spark-listeners and provides the logic for connecting to Log Analytics and for formatting and sending data via the Log Analytics API.

Option 1: Docker

Linux:

# To build all profiles:
docker run -it --rm -v `pwd`:/spark-monitoring -v "$HOME/.m2":/root/.m2 mcr.microsoft.com/java/maven:8-zulu-debian10 /spark-monitoring/build.sh
# To build a single profile (example for latest long term support version 10.4 LTS):
docker run -it --rm -v `pwd`:/spark-monitoring -v "$HOME/.m2":/root/.m2 -w /spark-monitoring/src mcr.microsoft.com/java/maven:8-zulu-debian10 mvn install -P "scala-2.12_spark-3.2.1"

Windows:

# To build all profiles:
docker run -it --rm -v %cd%:/spark-monitoring -v "%USERPROFILE%/.m2":/root/.m2 mcr.microsoft.com/java/maven:8-zulu-debian10 /spark-monitoring/build.sh
# To build a single profile (example for latest long term support version 10.4 LTS):
docker run -it --rm -v %cd%:/spark-monitoring -v "%USERPROFILE%/.m2":/root/.m2 -w /spark-monitoring/src mcr.microsoft.com/java/maven:8-zulu-debian10 mvn install -P "scala-2.12_spark-3.2.1"

Option 2: Maven

  1. Import the Maven project object model file, pom.xml, located in the /src folder into your project. This will import two projects:

    • spark-listeners
    • spark-listeners-loganalytics
  2. Activate a single Maven profile that corresponds to the Scala/Spark version combination in use. By default, the Scala 2.12 and Spark 3.0.1 profile is active.

  3. Execute the Maven package phase in your Java IDE to build the JAR files for each of these projects:

    Project                        JAR file
    spark-listeners                spark-listeners_<Spark Version>_<Scala Version>-<Version>.jar
    spark-listeners-loganalytics   spark-listeners-loganalytics_<Spark Version>_<Scala Version>-<Version>.jar
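
If you prefer the command line to an IDE, the same build can be run with Maven directly from the /src directory, using the same profile names as the Docker examples above. A minimal sketch, assuming a local JDK 8 and Maven toolchain like the one in the Docker image (shown for 10.4 LTS):

cd src
mvn install -P "scala-2.12_spark-3.2.1"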

Configure the Databricks workspace

Copy the JAR files and init scripts to Databricks.

  1. Use the Azure Databricks CLI to create a directory named dbfs:/databricks/spark-monitoring:

    dbfs mkdirs dbfs:/databricks/spark-monitoring
  2. Open the /src/spark-listeners/scripts/spark-monitoring.sh script file and add your Log Analytics Workspace ID and Key to the lines below:

    export LOG_ANALYTICS_WORKSPACE_ID=
    export LOG_ANALYTICS_WORKSPACE_KEY=

If you do not want to add your Log Analytics workspace ID and key to the init script in plaintext, you can instead create an Azure Key Vault backed secret scope and reference those secrets through your cluster's environment variables.
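
A minimal sketch of what those environment variables look like on the cluster, assuming a hypothetical secret scope named spark-monitoring (the {{secrets/...}} reference syntax is the same one that appears in the Key Vault issue further down this page):

    LOG_ANALYTICS_WORKSPACE_ID={{secrets/spark-monitoring/log-analytics-workspace-id}}
    LOG_ANALYTICS_WORKSPACE_KEY={{secrets/spark-monitoring/log-analytics-workspace-key}}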

  3. To add the x-ms-AzureResourceId header to each HTTP request, set the following environment variables in /src/spark-listeners/scripts/spark-monitoring.sh. For instance:
export AZ_SUBSCRIPTION_ID=11111111-5c17-4032-ae54-fc33d56047c2
export AZ_RSRC_GRP_NAME=myAzResourceGroup
export AZ_RSRC_PROV_NAMESPACE=Microsoft.Databricks
export AZ_RSRC_TYPE=workspaces
export AZ_RSRC_NAME=myDatabricks

Now the _ResourceId /subscriptions/11111111-5c17-4032-ae54-fc33d56047c2/resourceGroups/myAzResourceGroup/providers/Microsoft.Databricks/workspaces/myDatabricks will be part of the header. (Note: if any of these variables is unset, the header is omitted.)

  4. Use the Azure Databricks CLI to copy src/spark-listeners/scripts/spark-monitoring.sh to the directory created in step 1:

    dbfs cp src/spark-listeners/scripts/spark-monitoring.sh dbfs:/databricks/spark-monitoring/spark-monitoring.sh
  5. Use the Azure Databricks CLI to copy all of the JAR files from the src/target folder to the directory created in step 1:

    dbfs cp --overwrite --recursive src/target/ dbfs:/databricks/spark-monitoring/

Create and configure the Azure Databricks cluster

  1. Navigate to your Azure Databricks workspace in the Azure Portal.
  2. Under "Compute", click "Create Cluster".
  3. Choose a name for your cluster and enter it in the "Cluster name" text box.
  4. In the "Databricks Runtime Version" dropdown, select Runtime: 10.4 LTS (Scala 2.12, Spark 3.2.1).
  5. Under "Advanced Options", click on the "Init Scripts" tab. Go to the last line under the "Init Scripts section" Under the "destination" dropdown, select "DBFS". Enter "dbfs:/databricks/spark-monitoring/spark-monitoring.sh" in the text box. Click the "add" button.
  6. Click the "Create Cluster" button to create the cluster. Next, click on the "start" button to start the cluster.

Run the sample job (optional)

The repository includes a sample application that shows how to send application metrics and application logs to Azure Monitor.

When building the sample job, specify a Maven profile compatible with your Databricks Runtime from the Supported configurations section.

  1. Use Maven to build the POM located at sample/spark-sample-job/pom.xml or run the following Docker command:

    Linux:

    docker run -it --rm -v `pwd`/sample/spark-sample-job:/spark-sample-job -v "$HOME/.m2":/root/.m2 -w /spark-sample-job mcr.microsoft.com/java/maven:8-zulu-debian10 mvn install -P <maven-profile>

    Windows:

    docker run -it --rm -v %cd%/sample/spark-sample-job:/spark-sample-job -v "%USERPROFILE%/.m2":/root/.m2 -w /spark-sample-job mcr.microsoft.com/java/maven:8-zulu-debian10 mvn install -P <maven-profile>
  2. Navigate to your Databricks workspace and create a new job, as described here.

  3. In the job detail page, set Type to JAR.

  4. For Main class, enter com.microsoft.pnp.samplejob.StreamingQueryListenerSampleJob.

  5. Upload the JAR file from /src/spark-jobs/target/spark-jobs-1.0-SNAPSHOT.jar in the Dependent Libraries section.

  6. Select the cluster you created previously in the Cluster section.

  7. Select Create.

  8. Click the Run Now button to launch the job.

When the job runs, you can view the application logs and metrics in your Log Analytics workspace. After you verify the metrics appear, stop the sample application job.
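
As a quick spot check, a minimal query such as the following (SparkMetric_CL is described in the next section) confirms that records are arriving at all:

SparkMetric_CL
| where TimeGenerated > ago(15m)
| take 10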

Viewing the Sample Job's Logs in Log Analytics

After your sample job has run for a few minutes, you should be able to query for these event types in Log Analytics:

SparkListenerEvent_CL

This custom log will contain Spark events that are serialized to JSON. You can limit the volume of events in this log with filtering; if filtering is not employed, the volume of data can be large.

Note: There is a known issue when the Spark framework or workload generates events that have more than 500 fields, or where the data for an individual field is larger than 32 KB. Log Analytics will generate an error indicating that data has been dropped. This is an incompatibility between the data generated by Spark and the current limitations of the Log Analytics API.

Example for querying SparkListenerEvent_CL for job throughput over the last 7 days:

let results=SparkListenerEvent_CL
| where TimeGenerated > ago(7d)
| where  Event_s == "SparkListenerJobStart"
| extend metricsns=column_ifexists("Properties_spark_metrics_namespace_s",Properties_spark_app_id_s)
| extend apptag=iif(isnotempty(metricsns),metricsns,Properties_spark_app_id_s)
| project Job_ID_d,apptag,Properties_spark_databricks_clusterUsageTags_clusterName_s,TimeGenerated
| order by TimeGenerated asc nulls last
| join kind= inner (
    SparkListenerEvent_CL
    | where Event_s == "SparkListenerJobEnd"
    | where Job_Result_Result_s == "JobSucceeded"
    | project Event_s,Job_ID_d,TimeGenerated
) on Job_ID_d;
results
| extend slice=strcat("#JobsCompleted ",Properties_spark_databricks_clusterUsageTags_clusterName_s,"-",apptag)
| summarize count() by bin(TimeGenerated, 1h),slice
| order by TimeGenerated asc nulls last

SparkLoggingEvent_CL

This custom log will contain data forwarded from Log4j (the standard logging system in Spark). The volume of logging can be controlled by altering the level of logging to forward, or with filtering; see the sketch after the example query below.

Example for querying SparkLoggingEvent_CL for logged errors over the last day:

SparkLoggingEvent_CL
| where TimeGenerated > ago(1d)
| where Level == "ERROR"
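
The level that gets forwarded is set by the Log4j configuration on the cluster. As a rough sketch, based on the executor log4j.properties excerpt quoted in the issues below (the appender name logAnalyticsAppender and the file path there come from a Databricks Runtime 6.x cluster, so treat the details as assumptions for your runtime), raising the appender threshold forwards only errors:

# Forward only ERROR and above to Log Analytics (sketch)
log4j.appender.logAnalyticsAppender=com.microsoft.pnp.logging.loganalytics.LogAnalyticsAppender
log4j.appender.logAnalyticsAppender.filter.spark=com.microsoft.pnp.logging.SparkPropertyEnricher
log4j.appender.logAnalyticsAppender.Threshold=ERROR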

SparkMetric_CL

This custom log will contain metrics events generated by the Spark framework or workload. You can adjust the time period or the sources included by modifying the METRICS_PROPERTIES section of the spark-monitoring.sh script or by enabling filtering.

Example of querying SparkMetric_CL for the number of active executors per application over the last 7 days summarized every 15 minutes:

SparkMetric_CL
| where TimeGenerated > ago(7d)
| extend sname=split(name_s, ".")
| where sname[2] == "executor"
| extend executor=strcat(sname[1]) 
| extend app=strcat(sname[0])
| summarize NumExecutors=dcount(executor) by bin(TimeGenerated,  15m),app
| order by TimeGenerated asc nulls last

Note: For more details on how to use the saved search queries in logAnalyticsDeploy.json to understand and troubleshoot performance, see Observability patterns and metrics for performance tuning.

Filtering

The library is configurable to limit the volume of logs that are sent to each of the different Azure Monitor log types. See filtering for more details.

Debugging

If you encounter any issues with the init script, you can refer to the docs on debugging.

Contributing

See: CONTRIBUTING.md

spark-monitoring's People

Contributors

al-lac, algattik, atoakley, brillianttyagi, ckittel, dennislee-dennislee, dependabot[bot], dfanesidb, esdhrgl1984, hallihan, jocontr, juzuluag, lyytinen, matthiaspfenninger, mikanygtc, neil90, nithinpnp, petertaylor9999, rohitsharma-pnp, smarker, splinecl, veronicawasson


spark-monitoring's Issues

How to use in python notebook

It is not quite clear in the docs how to get Python notebooks to log to Log Analytics.

I have read about the two options mentioned in #28

The reason it works in the sample is because we have configured log4j to log from our sample job package. To use this from a Databricks Notebook, you will need to do the same. There are a couple of options.

  • You can configure the whole cluster to log to Log Analytics, which will include notebooks
  • You can include the code below in every Databricks Notebook. It should work for any notebook because it pulls the class name from the notebook when it runs, but this is lightly tested, so YMMV.

Option A: does the init script simply do this, or do I need to configure something extra in order to have all notebooks send logs to Log Analytics?
Option B: the code is Scala, but what about Python-based notebooks?

Hope you can clear up any confusion I have.
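
(One hedged sketch for Option B from Python: a notebook can reach the driver's Log4j through py4j, so with the cluster-wide appender from Option A in place, messages logged this way should be forwarded as well. The logger name is a hypothetical example:)

log4j = sc._jvm.org.apache.log4j
logger = log4j.LogManager.getLogger("com.example.notebook")  # hypothetical logger name
logger.info("info message from a Python notebook")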

Can't build sample project

The readme says that I should be able to open the sample project's POM located at sample/spark-sample-job/pom.xml and build it using Maven.

I don't have much experience with Java and Maven. I'm using IntelliJ IDEA. I'm able to build the other projects, but the sample one says:
Cannot resolve com.microsoft.pnp:spark-listeners_2.11_2.4.3:1.0.0

Looking in the Maven repository, there isn't a package by that name, so I'm assuming this should come from the project in /src.

The sample pom.xml has this, but I'm not sure how to tell it where to find that dependency:

<dependency>
    <groupId>com.microsoft.pnp</groupId>
    <artifactId>spark-listeners_${scala.compat.version}_${spark.version}</artifactId>
    <version>${spark.listeners.version}</version>
    <scope>provided</scope>
</dependency>

Only forward limited log to Azure Log Analytics

I built the library and tried it out in our environment for a few days. The cluster ran only a few hours per day, but I'm receiving a huge amount of log data in the Azure Log Analytics workspace. I assume it can be configured to forward only chosen logs. Could anybody help me out?

Global Keyvault backed secrets

Is there a way to set global azure keyvault backed environment variables which can be used in init scripts?

Currently only setting them via the UI seems to work.

I tried adding the following to /databricks/spark/conf/spark-env.sh, but that does not seem to work:

LOG_ANALYTICS_WORKSPACE_ID={{secrets/<secret-scope-name>/<keyvault-key-name>}}
LOG_ANALYTICS_WORKSPACE_KEY={{secrets/<secret-scope-name>/<keyvault-key-name>}}

Documentation:
https://github.com/mspnp/spark-monitoring/blob/master/docs/keyvault-backed-secrets.md

@Smarker Since you wrote the documentation, would you have any advice for me?

Add support for Databricks Runtime 7.0 (Scala 2.12, Spark 3.0)

Please add support for the new Databricks Runtime 7.0 which is based on Scala 2.12 and Spark 3.0.

Adding a Maven profile for it to pom.xml does not work because of too many changes in the new Spark and Scala versions:

<profile>
    <id>scala-2.12_spark-3.0.0</id>
    <activation>
        <activeByDefault>true</activeByDefault>
    </activation>
    <properties>
        <codahale.metrics.version>3.1.5</codahale.metrics.version>
        <commons-codec.version>1.10</commons-codec.version>
        <commons.httpclient.version>4.5.6</commons.httpclient.version>
        <fasterxml.jackson.databind.version>2.10.0</fasterxml.jackson.databind.version>
        <jetty.version>9.4.18.v20190429</jetty.version>
        <log4j.version>1.2.17</log4j.version>
        <slf4j.version>1.7.30</slf4j.version>
        <spark.version>3.0.0</spark.version>
        <scala.version>2.12.10</scala.version>
        <scala.compat.version>2.12</scala.compat.version>
    </properties>
</profile>

How to send custom application logs to a separate Custom Log?

So the sample project is set up to send logs to three different custom logs: SparkListenerEvent_CL, SparkLoggingEvent_CL and SparkMetric_CL. These are mostly for internal Spark logs and metrics.

What we need is to set up our own logger class that will write our custom application logs (by calling logInfo, logWarning, logError, etc. throughout our application).

We've extended the sample project to include the following class:

package com.microsoft.pnp.samplejob

import java.util

import com.microsoft.pnp.logging.{Log4jConfiguration, MDCCloseableFactory}
import com.microsoft.pnp.util.TryWith
import org.apache.spark.internal.Logging

import scala.util.Try

case class LogAnalyticsLogger(runId: String, sqlServer: String, storageRoot: String) extends Logging {
  // Configure our logging
  TryWith(getClass.getResourceAsStream("/com/microsoft/pnp/samplejob/log4j.properties")) {
    stream => {
      Log4jConfiguration.configure(stream)
    }
  }

  // Create a HashMap to add the custom properties that will be added to the event.
  val context = new util.HashMap[String, AnyRef]()
  context.put("runId", runId)
  context.put("sqlServer", sqlServer)
  context.put("storageRoot", storageRoot)

  // Create a MDC Factory that will add the properties to the event.
  // This class is not thread-safe so wrap it in a TryWith and don't make it live longer than needed.
  val mdcFactory: MDCCloseableFactory = new MDCCloseableFactory()
  TryWith(mdcFactory.create(context))(
    _ => {
      logInfo("Initializing logger with context")
      logWarning("Warning logger with context")
      logError("Error logger with context")
    }
  )

  def info(msg: String): Try[Unit] = {
    val mdcFactory: MDCCloseableFactory = new MDCCloseableFactory()
    TryWith(mdcFactory.create(context))(
      _ => {
        logInfo(msg)
      }
    )
  }

  def warning(msg: String): Try[Unit] = {
    val mdcFactory: MDCCloseableFactory = new MDCCloseableFactory()
    TryWith(mdcFactory.create(context))(
      _ => {
        logWarning(msg)
      }
    )
  }

  def error(msg: String, ex: Throwable): Try[Unit] = {
    val mdcFactory: MDCCloseableFactory = new MDCCloseableFactory()
    TryWith(mdcFactory.create(context))(
      _ => {
        logError(msg, ex)
      }
    )
  }
}

So far this works as expected, logging with our custom properties and all. But the logs are all mixed with the rest of the Spark internal logs in SparkLoggingEvent_CL. We want to have them in our own separate Custom Log group.

We're assuming we need to edit the log4j.properties file, but we're not sure how, and we can't find documentation about it.
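
(One possible direction, inferred from the notebook examples further down this page rather than from official docs: the appender's logType property names the target custom log, so routing your logger to a second appender with its own logType should land these messages in a separate table. A sketch, with MyApplicationLogs as a hypothetical name that would surface as MyApplicationLogs_CL:)

log4j.appender.appLogs=com.microsoft.pnp.logging.loganalytics.LogAnalyticsAppender
log4j.appender.appLogs.layout=com.microsoft.pnp.logging.JSONLayout
log4j.appender.appLogs.logType=MyApplicationLogs
log4j.additivity.com.microsoft.pnp.samplejob=false
log4j.logger.com.microsoft.pnp.samplejob=INFO, appLogs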

provide an example on how to use this logging framework within a Databricks notebook

I am trying to use the logging framework within a DBX notebook; however, when I use the logger to write any type of log (warn, debug), it does not write to my Log Analytics workspace. It does work using the sample job/jar you provided. If possible, can you provide an example of how to use this logging framework within a Databricks notebook? Thank you!

spark continuously logging Error sending to Log Analytics

Hi,

I have added the monitoring package for my Structured Streaming job (Azure Databricks Runtime 6.4, Spark 2.4.5, Scala 2.11).
I can see the logs showing up in SparkLoggingEvent_CL and SparkListenerEvent_CL, but my Spark job log in the Databricks UI continuously logs the error

"Error sending to Log Analytics
java.io.IOException: Error sending to Log Analytics"
(attached image)

I am using Log Analytics to create alerts based on log queries matching error records, but I am not sure whether some logs are failing to be pushed to Log Analytics; my alert will not fire for error records that get dropped due to the above error.

Thanks

logAnalyticsAppender impacting the performance of cluster after throwing an error

I am trying to send Azure Databricks custom logs to an Azure Log Analytics workspace using the steps given in the GitHub documentation. The code I am using in a Databricks notebook is:

import com.microsoft.pnp.util.TryWith
import com.microsoft.pnp.logging.Log4jConfiguration
import java.io.ByteArrayInputStream
import org.slf4j.LoggerFactory
import org.slf4j.Logger


val loggerName :String = "fromNotebook"
val level : String = "INFO" 
val logType: String = "testlogs"

val log4jConfig = s"""
log4j.appender.logAnalytics=com.microsoft.pnp.logging.loganalytics.LogAnalyticsAppender
log4j.appender.logAnalytics.layout=com.microsoft.pnp.logging.JSONLayout
log4j.appender.logAnalytics.layout.LocationInfo=false
log4j.appender.logAnalytics.logType=$logType
log4j.additivity.$loggerName=false
log4j.logger.$loggerName=$level, logAnalytics
"""
TryWith(new ByteArrayInputStream(log4jConfig.getBytes())) {
  stream => {
    Log4jConfiguration.configure(stream)
  }
}

val logger = LoggerFactory.getLogger(loggerName)
logger.info("logging info from " + loggerName)
logger.warn("Warn message " + loggerName)
logger.error("Error message " + loggerName)

My /home/ubuntu/databricks/spark/dbconf/log4j/executor/log4j.properties for this appender looks like:

log4j.rootCategory=INFO, console, logAnalyticsAppender
# logAnalytics
log4j.appender.logAnalyticsAppender=com.microsoft.pnp.logging.loganalytics.LogAnalyticsAppender
log4j.appender.logAnalyticsAppender.filter.spark=com.microsoft.pnp.logging.SparkPropertyEnricher
#Disable all other logs
log4j.appender.logAnalyticsAppender.Threshold=INFO 

But it is showing weird behavior. It works fine for level INFO, but if I try to log anything below or above the level I declared in the configuration, it throws the error below, and after that, no matter what changes I make in my code, it only works again after I restart my cluster. My cluster performance is also impacted once it throws the error. Sometimes this code even keeps running for an indefinite period of time.

Error I am getting is:

log4j:ERROR Error sending logging event to Log Analytics
java.util.concurrent.RejectedExecutionException: Task com.microsoft.pnp.client.loganalytics.LogAnalyticsSendBufferTask@5b2a430 rejected from java.util.concurrent.ThreadPoolExecutor@699636fd[Terminated, pool size = 0, active threads = 0, queued tasks = 0, completed tasks = 112]
    at java.util.concurrent.ThreadPoolExecutor$AbortPolicy.rejectedExecution(ThreadPoolExecutor.java:2063)
    at java.util.concurrent.ThreadPoolExecutor.reject(ThreadPoolExecutor.java:830)
    at java.util.concurrent.ThreadPoolExecutor.execute(ThreadPoolExecutor.java:1379)
    at com.microsoft.pnp.client.GenericSendBuffer.send(GenericSendBuffer.java:88)
    at com.microsoft.pnp.client.loganalytics.LogAnalyticsSendBufferClient.sendMessage(LogAnalyticsSendBufferClient.java:43)
    at com.microsoft.pnp.logging.loganalytics.LogAnalyticsAppender.append(LogAnalyticsAppender.java:52)
    at org.apache.log4j.AppenderSkeleton.doAppend(AppenderSkeleton.java:251)
    at org.apache.log4j.helpers.AppenderAttachableImpl.appendLoopOnAppenders(AppenderAttachableImpl.java:66)
    at org.apache.log4j.Category.callAppenders(Category.java:206)
    at org.apache.log4j.Category.forcedLog(Category.java:391)
    at org.apache.log4j.Category.log(Category.java:856)
    at org.slf4j.impl.Log4jLoggerAdapter.info(Log4jLoggerAdapter.java:305)
    at log4jWrapper.MyLogger.info(MyLogger.scala:48)
    at line07d51ea7c1834afc957316967b0d0e8225.$read$$iw$$iw$$iw$$iw$$iw$$iw.<init>(command-764707897465587:10)
    at line07d51ea7c1834afc957316967b0d0e8225.$read$$iw$$iw$$iw$$iw$$iw.<init>(command-764707897465587:63)
    at line07d51ea7c1834afc957316967b0d0e8225.$read$$iw$$iw$$iw$$iw.<init>(command-764707897465587:65)
    at line07d51ea7c1834afc957316967b0d0e8225.$read$$iw$$iw$$iw.<init>(command-764707897465587:67)
    at line07d51ea7c1834afc957316967b0d0e8225.$read$$iw$$iw.<init>(command-764707897465587:69)
    at line07d51ea7c1834afc957316967b0d0e8225.$read$$iw.<init>(command-764707897465587:71)
    at line07d51ea7c1834afc957316967b0d0e8225.$read.<init>(command-764707897465587:73)
    at line07d51ea7c1834afc957316967b0d0e8225.$read$.<init>(command-764707897465587:77)
    at line07d51ea7c1834afc957316967b0d0e8225.$read$.<clinit>(command-764707897465587)
    at line07d51ea7c1834afc957316967b0d0e8225.$eval$.$print$lzycompute(<notebook>:7)
    at line07d51ea7c1834afc957316967b0d0e8225.$eval$.$print(<notebook>:6)
    at line07d51ea7c1834afc957316967b0d0e8225.$eval.$print(<notebook>)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at scala.tools.nsc.interpreter.IMain$ReadEvalPrint.call(IMain.scala:793)
    at scala.tools.nsc.interpreter.IMain$Request.loadAndRun(IMain.scala:1054)
    at scala.tools.nsc.interpreter.IMain$WrappedRequest$$anonfun$loadAndRunReq$1.apply(IMain.scala:645)
    at scala.tools.nsc.interpreter.IMain$WrappedRequest$$anonfun$loadAndRunReq$1.apply(IMain.scala:644)
    at scala.reflect.internal.util.ScalaClassLoader$class.asContext(ScalaClassLoader.scala:31)
    at scala.reflect.internal.util.AbstractFileClassLoader.asContext(AbstractFileClassLoader.scala:19)
    at scala.tools.nsc.interpreter.IMain$WrappedRequest.loadAndRunReq(IMain.scala:644)
    at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:576)
    at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:572)
    at com.databricks.backend.daemon.driver.DriverILoop.execute(DriverILoop.scala:215)
    at com.databricks.backend.daemon.driver.ScalaDriverLocal$$anonfun$repl$1.apply$mcV$sp(ScalaDriverLocal.scala:202)
    at com.databricks.backend.daemon.driver.ScalaDriverLocal$$anonfun$repl$1.apply(ScalaDriverLocal.scala:202)
    at com.databricks.backend.daemon.driver.ScalaDriverLocal$$anonfun$repl$1.apply(ScalaDriverLocal.scala:202)
    at com.databricks.backend.daemon.driver.DriverLocal$TrapExitInternal$.trapExit(DriverLocal.scala:685)
    at com.databricks.backend.daemon.driver.DriverLocal$TrapExit$.apply(DriverLocal.scala:638)
    at com.databricks.backend.daemon.driver.ScalaDriverLocal.repl(ScalaDriverLocal.scala:202)
    at com.databricks.backend.daemon.driver.DriverLocal$$anonfun$execute$8.apply(DriverLocal.scala:373)
    at com.databricks.backend.daemon.driver.DriverLocal$$anonfun$execute$8.apply(DriverLocal.scala:350)
    at com.databricks.logging.UsageLogging$$anonfun$withAttributionContext$1.apply(UsageLogging.scala:238)
    at scala.util.DynamicVariable.withValue(DynamicVariable.scala:58)
    at com.databricks.logging.UsageLogging$class.withAttributionContext(UsageLogging.scala:233)
    at com.databricks.backend.daemon.driver.DriverLocal.withAttributionContext(DriverLocal.scala:48)
    at com.databricks.logging.UsageLogging$class.withAttributionTags(UsageLogging.scala:271)
    at com.databricks.backend.daemon.driver.DriverLocal.withAttributionTags(DriverLocal.scala:48)
    at com.databricks.backend.daemon.driver.DriverLocal.execute(DriverLocal.scala:350)
    at com.databricks.backend.daemon.driver.DriverWrapper$$anonfun$tryExecutingCommand$2.apply(DriverWrapper.scala:644)
    at com.databricks.backend.daemon.driver.DriverWrapper$$anonfun$tryExecutingCommand$2.apply(DriverWrapper.scala:644)
    at scala.util.Try$.apply(Try.scala:192)
    at com.databricks.backend.daemon.driver.DriverWrapper.tryExecutingCommand(DriverWrapper.scala:639)
    at com.databricks.backend.daemon.driver.DriverWrapper.getCommandOutputAndError(DriverWrapper.scala:485)
    at com.databricks.backend.daemon.driver.DriverWrapper.executeCommand(DriverWrapper.scala:597)
    at com.databricks.backend.daemon.driver.DriverWrapper.runInnerLoop(DriverWrapper.scala:390)
    at com.databricks.backend.daemon.driver.DriverWrapper.runInner(DriverWrapper.scala:337)
    at com.databricks.backend.daemon.driver.DriverWrapper.run(DriverWrapper.scala:219)
    at java.lang.Thread.run(Thread.java:748)

Error while building spark-listeners

Here is my mvn version

Apache Maven 3.6.3 (cecedd343002696d0abb50b32b541b8a6ba2883f)
Maven home: C:\Program Files\apache-maven-3.6.3\bin\..
Java version: 1.8.0_251, vendor: Oracle Corporation, runtime: C:\Program Files\Java\jdk1.8.0_251\jre
Default locale: en_US, platform encoding: Cp1252
OS name: "windows 10", version: "10.0", arch: "amd64", family: "windows"

Error :
Error:(7, 24) java: cannot find symbol
symbol: class SparkInformation
location: package org.apache.spark
Error:(24, 40) java: cannot find symbol
symbol: variable SparkInformation
location: class com.microsoft.pnp.logging.SparkPropertyEnricher

Could you please suggest what I am missing?

How to log only Error log from spark

Hi,
I don't have any custom application logging, but I want only "Error" logs from Spark to be written to Log Analytics. Is there any option to configure this? Presently it is logging everything (Info/Warning/Error) to Log Analytics.

Release publish in public repository

Hi

Do you plan to publish a release of this library in a public artifact repository like Maven Central in the near future? I want to use this lib in one of my projects, but it's very complicated for me to manage dependencies if the lib is not published in a public artifact repository.

Thank you

Log4j appender thread pool needs to be shutdown

The GenericSendBuffer class has an Executor that log publishing tasks are submitted to.

Any application that uses the library will hang for over a minute on completion, waiting for this Executor to auto-shutdown.

Therefore, the library should expose the ability to shut down this Executor on completion of a job.

At present the Executor is package private, and therefore clients cannot access it to shut it down.

Uncaught exceptions are sometimes not logged due to send buffer

The stack trace from uncaught exceptions in jobs sometimes (more often than not) never reach Log Analytics because of the 5000ms/25M log buffer implemented in the underlying layers of the appender.

The LogAnalyticsAppender log4j appender implementation doesn't expose the fact that it is buffering, so even when immediate flush is set to true (the default), the underlying classes buffer the log messages. When a Databricks job is killed due to an uncaught exception, it appears the last few buffered messages (including the stack trace) get lost along with it.

We've somewhat worked around this by adding LogManager.shutdown() to a try-catch block in the job logic, however this appears to have side effects we are still investigating -- my current hunch is that this call causes all other (e.g. cluster infra logs) to stop flowing as well after the job terminates.
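
(For reference, a minimal sketch of that workaround, with runJob() standing in for the job's actual entry point:)

import org.apache.log4j.LogManager

try {
  runJob() // hypothetical entry point for the job logic
} finally {
  // Flush the appender's send buffer before the JVM exits (with the side effects noted above).
  LogManager.shutdown()
}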

Tangentially related - it looks like the buffering code was refactored at one point and LogAnalyticsSendBufferClient has references to the 5000ms/25M but those aren't used anywhere, and are in fact controlled by the LogAnalyticsSendBuffer instance.

support for scala 2.12

I implemented the Grafana monitoring tool, getting metrics from Databricks through Azure Log Analytics and Azure Monitor, as specified here: https://docs.microsoft.com/en-us/azure/architecture/databricks-monitoring/dashboards
This library is based on Scala 2.11 and Spark 2.4.3.

I guess my previous cluster, where the monitoring solution was working, was runtime 5.5. But as I updated my cluster's runtime to the latest version, based on Scala 2.12, the monitoring tool stopped working. More specifically, it looks like there is a problem in the init bash script (spark-monitoring.sh).

I wonder if you have a plan to update the spark-monitoring package for the new version of the Databricks runtime.

Unable to build jar for spark-listeners-loganalytics

Hi

I am unable to build the second jar (for spark-listeners-loganalytics).

When I do a Maven build, the build completes successfully, but when I create the jar using Maven install, I get a dependency issue on com.microsoft.pnp.

Error that I get is

Failed to get dependencies for com.microsoft.pnp: spark-listeners_2.11_2.4.4: Failed to read artifact descriptor

I tried to find some solutions on the internet, but no luck so far.

Any help would be greatly appreciated.

Thanks

'SparkListenerEvent_CL' not recognized

Guys, in Log Analytics, all the requests that leverage SparkListenerEvent_CL give me the following error message, whereas the ones that rely on SparkMetric_CL work well.

'where' operator: Failed to resolve table or column expression named 'SparkListenerEvent_CL'

I'm running DB 6.4 (includes Apache Spark 2.4.5, Scala 2.11).
I had to tweak the pom.xml and build.sh to compile the libs for Spark 2.4.5.

The libs are properly uploaded to DB and the init script ran well.
Can you help me?

Grafana Dashboard error in Job and Stage Latency panels

I get the following error on the Job and Stage Latency panels when I deploy the Grafana dashboard generated by the script:

'extend' operator: Failed to resolve scalar expression named 'Properties_spark_metrics_namespace_s'

Putting "Properties_spark_metrics_namespace_s" in both "D" queries in quotes seems to solve the issue.

How to use this in a SBT project?

How can I make use of this in a project that uses SBT instead of Maven? The sample shows how to do it with Maven, but I can't seem to get it working in my SBT project.

Thanks

Running multiple jobs on same cluster throwing error

I am trying to run multiple jobs with different log types, but it's throwing an error:

Task com.microsoft.pnp.client.loganalytics.LogAnalyticsSendBufferTask@750db652 rejected from java.util.concurrent.ThreadPoolExecutor@6d69d47c[Shutting down, pool size = 2, active threads = 2, queued tasks = 0, completed tasks = 51]

def getLogger(queueName: String): Logger = {

  val className = getClass.getName
  val log4jConfig =
    s"""
    log4j.appender.logAnalytics=com.microsoft.pnp.logging.loganalytics.LogAnalyticsAppender
    log4j.appender.logAnalytics.layout=com.microsoft.pnp.logging.JSONLayout
    log4j.appender.logAnalytics.layout.LocationInfo=false
    log4j.appender.logAnalytics.logType=Test_${queueName.replaceAll("-","_")}
    log4j.additivity.$className=false
    log4j.logger.$className=INFO, logAnalytics
    """

spark-monitoring Library is not working for the 6.4 Databricks Runtime Version

This is a follow-up on #68, which fixed support for Databricks 6.0-6.3 but does not cover the 6.4 runtime.

The runtime uses Apache Spark version 2.4.5, as stated in the docs. PR #69 added a Maven build profile for 6.0-6.3 and updated the cluster init script to load the required .jar files dynamically based on the Databricks Runtime version.

This results in the script trying to load a jar file containing scala-2.11_spark-2.4.5 in the file name.

The quickfix would be to add a new Maven build profile that creates the proper jar file with the right dependency on Spark version 2.4.5.

I will try to submit a PR for the quickfix, but I'm guessing this might be an issue in the future for any updates to the Databricks runtime, e.g. 6.5 and 7.0.

How to log from a notebook with custom properties

Hi to all,

Actually we are using log4j to log errors, but our custom logs are mixed with the Spark logs.

To resolve this we used the line below:
log4j.appender.logAnalytics.logType=CUSTOM_LOG_NAME

but we are still getting SparkLoggingEvent_CL.

We do not understand how to do this.

Could anyone help with this?
Any help would be appreciated.

Thank you

Information:java: Errors occurred while compiling module 'spark-listeners'

We are having this issue. Please help.
In https://github.com/mspnp/spark-monitoring/blob/master/README.md#build-the-azure-databricks-monitoring-library,
step 2 ("Execute the Maven package build phase"),
we are getting an error. Here is the error message we see when creating JAR files from the project downloaded from GitHub:

Information:java: Errors occurred while compiling module 'spark-listeners'
Information:javac 11.0.4 was used to compile java sources
Information:10/8/2019 7:21 AM - Build completed with 2 errors and 6 warnings in 27 s 90 ms
C:\Users\sandeep_avuthu\Downloads\spark-monitoring-master\spark-monitoring-master\src\spark-listeners\src\main\java\com\microsoft\pnp\logging\SparkPropertyEnricher.java
Error:(7, 24) java: cannot find symbol
symbol: class SparkInformation
location: package org.apache.spark
Error:(24, 40) java: cannot find symbol
symbol: variable SparkInformation
location: class com.microsoft.pnp.logging.SparkPropertyEnricher
C:\Users\sandeep_avuthu\Downloads\spark-monitoring-master\spark-monitoring-master\src\spark-listeners\src\main\java\com\microsoft\pnp\client\GenericSendBuffer.java
C:\Users\sandeep_avuthu\Downloads\spark-monitoring-master\spark-monitoring-master\src\spark-listeners\src\main\java\com\microsoft\pnp\logging\JSONLayout.java

Spark-Listener dependency issue

Hi project team,

we tried to compile a jar file, but there is an issue with a dependency.
We are using the following command to resolve the dependencies:
mvn dependency:resolve

... and receive the following error message:

[ERROR] Failed to execute goal on project spark-jobs: Could not resolve dependencies for project com.microsoft.pnp:spark-jobs:jar:1.0-SNAPSHOT: Could not find artifact com.microsoft.pnp:spark-listeners:jar:1.0-SNAPSHOT -> [Help 1]

can not build master branch

I tried to build the most recent branch with mvn and got an error.

mvn --version
Maven home: C:\Program Files (x86)\apache-maven-3.6.1\bin\..
Java version: 12.0.1, vendor: Oracle Corporation, runtime: C:\Program Files\Java\jdk-12.0.1
Default locale: en_US, platform encoding: Cp1252
OS name: "windows 10", version: "10.0", arch: "amd64", family: "windows"

scala -version
Scala code runner version 2.11.8 -- Copyright 2002-2016, LAMP/EPFL

mvn clean install -e -X

long stack trace

Apache Maven 3.6.1 (d66c9c0b3152b2e69ee9bac180bb8fcc8e6af555; 2019-04-04T21:00:29+02:00)
Maven home: C:\Program Files (x86)\apache-maven-3.6.1\bin\..
Java version: 12.0.1, vendor: Oracle Corporation, runtime: C:\Program Files\Java\jdk-12.0.1
Default locale: en_US, platform encoding: Cp1252
OS name: "windows 10", version: "10.0", arch: "amd64", family: "windows"
[DEBUG] Created new class realm maven.api
[DEBUG] Importing foreign packages into class realm maven.api
[DEBUG] Imported: javax.annotation.* < plexus.core
[DEBUG] Imported: javax.annotation.security.* < plexus.core
[DEBUG] Imported: javax.enterprise.inject.* < plexus.core
[DEBUG] Imported: javax.enterprise.util.* < plexus.core
[DEBUG] Imported: javax.inject.* < plexus.core
[DEBUG] Imported: org.apache.maven.* < plexus.core
[DEBUG] Imported: org.apache.maven.artifact < plexus.core
[DEBUG] Imported: org.apache.maven.classrealm < plexus.core
[DEBUG] Imported: org.apache.maven.cli < plexus.core
[DEBUG] Imported: org.apache.maven.configuration < plexus.core
[DEBUG] Imported: org.apache.maven.exception < plexus.core
[DEBUG] Imported: org.apache.maven.execution < plexus.core
[DEBUG] Imported: org.apache.maven.execution.scope < plexus.core
[DEBUG] Imported: org.apache.maven.lifecycle < plexus.core
[DEBUG] Imported: org.apache.maven.model < plexus.core
[DEBUG] Imported: org.apache.maven.monitor < plexus.core
[DEBUG] Imported: org.apache.maven.plugin < plexus.core
[DEBUG] Imported: org.apache.maven.profiles < plexus.core
[DEBUG] Imported: org.apache.maven.project < plexus.core
[DEBUG] Imported: org.apache.maven.reporting < plexus.core
[DEBUG] Imported: org.apache.maven.repository < plexus.core
[DEBUG] Imported: org.apache.maven.rtinfo < plexus.core
[DEBUG] Imported: org.apache.maven.settings < plexus.core
[DEBUG] Imported: org.apache.maven.toolchain < plexus.core
[DEBUG] Imported: org.apache.maven.usability < plexus.core
[DEBUG] Imported: org.apache.maven.wagon.* < plexus.core
[DEBUG] Imported: org.apache.maven.wagon.authentication < plexus.core
[DEBUG] Imported: org.apache.maven.wagon.authorization < plexus.core
[DEBUG] Imported: org.apache.maven.wagon.events < plexus.core
[DEBUG] Imported: org.apache.maven.wagon.observers < plexus.core
[DEBUG] Imported: org.apache.maven.wagon.proxy < plexus.core
[DEBUG] Imported: org.apache.maven.wagon.repository < plexus.core
[DEBUG] Imported: org.apache.maven.wagon.resource < plexus.core
[DEBUG] Imported: org.codehaus.classworlds < plexus.core
[DEBUG] Imported: org.codehaus.plexus.* < plexus.core
[DEBUG] Imported: org.codehaus.plexus.classworlds < plexus.core
[DEBUG] Imported: org.codehaus.plexus.component < plexus.core
[DEBUG] Imported: org.codehaus.plexus.configuration < plexus.core
[DEBUG] Imported: org.codehaus.plexus.container < plexus.core
[DEBUG] Imported: org.codehaus.plexus.context < plexus.core
[DEBUG] Imported: org.codehaus.plexus.lifecycle < plexus.core
[DEBUG] Imported: org.codehaus.plexus.logging < plexus.core
[DEBUG] Imported: org.codehaus.plexus.personality < plexus.core
[DEBUG] Imported: org.codehaus.plexus.util.xml.Xpp3Dom < plexus.core
[DEBUG] Imported: org.codehaus.plexus.util.xml.pull.XmlPullParser < plexus.core
[DEBUG] Imported: org.codehaus.plexus.util.xml.pull.XmlPullParserException < plexus.core
[DEBUG] Imported: org.codehaus.plexus.util.xml.pull.XmlSerializer < plexus.core
[DEBUG] Imported: org.eclipse.aether.* < plexus.core
[DEBUG] Imported: org.eclipse.aether.artifact < plexus.core
[DEBUG] Imported: org.eclipse.aether.collection < plexus.core
[DEBUG] Imported: org.eclipse.aether.deployment < plexus.core
[DEBUG] Imported: org.eclipse.aether.graph < plexus.core
[DEBUG] Imported: org.eclipse.aether.impl < plexus.core
[DEBUG] Imported: org.eclipse.aether.installation < plexus.core
[DEBUG] Imported: org.eclipse.aether.internal.impl < plexus.core
[DEBUG] Imported: org.eclipse.aether.metadata < plexus.core
[DEBUG] Imported: org.eclipse.aether.repository < plexus.core
[DEBUG] Imported: org.eclipse.aether.resolution < plexus.core
[DEBUG] Imported: org.eclipse.aether.spi < plexus.core
[DEBUG] Imported: org.eclipse.aether.transfer < plexus.core
[DEBUG] Imported: org.eclipse.aether.version < plexus.core
[DEBUG] Imported: org.fusesource.jansi.* < plexus.core
[DEBUG] Imported: org.slf4j.* < plexus.core
[DEBUG] Imported: org.slf4j.event.* < plexus.core
[DEBUG] Imported: org.slf4j.helpers.* < plexus.core
[DEBUG] Imported: org.slf4j.spi.* < plexus.core
[DEBUG] Populating class realm maven.api
[INFO] Error stacktraces are turned on.
[DEBUG] Message scheme: color
[DEBUG] Message styles: debug info warning error success failure strong mojo project
[DEBUG] Reading global settings from C:\Program Files (x86)\apache-maven-3.6.1\bin\..\conf\settings.xml
[DEBUG] Reading user settings from C:\Users\KOF4BE\.m2\settings.xml
[DEBUG] Reading global toolchains from C:\Program Files (x86)\apache-maven-3.6.1\bin\..\conf\toolchains.xml
[DEBUG] Reading user toolchains from C:\Users\KOF4BE\.m2\toolchains.xml
[DEBUG] Using local repository at C:\Users\KOF4BE\.m2\repository
[DEBUG] Using manager EnhancedLocalRepositoryManager with priority 10.0 for C:\Users\KOF4BE\.m2\repository
[INFO] Scanning for projects...
[DEBUG] Extension realms for project com.microsoft.pnp:spark-monitoring:pom:1.0-SNAPSHOT: (none)
[DEBUG] Looking up lifecycle mappings for packaging pom from ClassRealm[plexus.core, parent: null]
[DEBUG] Extension realms for project com.microsoft.pnp:spark-listeners:jar:1.0-SNAPSHOT: (none)
[DEBUG] Looking up lifecycle mappings for packaging jar from ClassRealm[plexus.core, parent: null]
[DEBUG] Extension realms for project com.microsoft.pnp:spark-jobs:jar:1.0-SNAPSHOT: (none)
[DEBUG] Looking up lifecycle mappings for packaging jar from ClassRealm[plexus.core, parent: null]
[DEBUG] Extension realms for project com.microsoft.pnp:spark-listeners-loganalytics:jar:1.0-SNAPSHOT: (none)
[DEBUG] Looking up lifecycle mappings for packaging jar from ClassRealm[plexus.core, parent: null]
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Build Order:
[INFO]
[INFO] spark-monitoring [pom]
[INFO] spark-listeners [jar]
[INFO] spark-jobs [jar]
[INFO] spark-listeners-loganalytics [jar]
[DEBUG] === REACTOR BUILD PLAN ================================================
[DEBUG] Project: com.microsoft.pnp:spark-monitoring:pom:1.0-SNAPSHOT
[DEBUG] Tasks: [clean, install]
[DEBUG] Style: Regular
[DEBUG] -----------------------------------------------------------------------
[DEBUG] Project: com.microsoft.pnp:spark-listeners:jar:1.0-SNAPSHOT
[DEBUG] Tasks: [clean, install]
[DEBUG] Style: Regular
[DEBUG] -----------------------------------------------------------------------
[DEBUG] Project: com.microsoft.pnp:spark-jobs:jar:1.0-SNAPSHOT
[DEBUG] Tasks: [clean, install]
[DEBUG] Style: Regular
[DEBUG] -----------------------------------------------------------------------
[DEBUG] Project: com.microsoft.pnp:spark-listeners-loganalytics:jar:1.0-SNAPSHOT
[DEBUG] Tasks: [clean, install]
[DEBUG] Style: Regular
[DEBUG] =======================================================================
[INFO]
[INFO] -----------------< com.microsoft.pnp:spark-monitoring >-----------------
[INFO] Building spark-monitoring 1.0-SNAPSHOT [1/4]
[INFO] --------------------------------[ pom ]---------------------------------
[DEBUG] Lifecycle default -> [validate, initialize, generate-sources, process-sources, generate-resources, process-resources, compile, process-classes, generate-test-sources, process-test-sources, generate-test-resources, process-test-resources, test-compile, process-test-classes, test, prepare-package, package, pre-integration-test, integration-test, post-integration-test, verify, install, deploy]
[DEBUG] Lifecycle clean -> [pre-clean, clean, post-clean]
[DEBUG] Lifecycle site -> [pre-site, site, post-site, site-deploy]
[DEBUG] Lifecycle default -> [validate, initialize, generate-sources, process-sources, generate-resources, process-resources, compile, process-classes, generate-test-sources, process-test-sources, generate-test-resources, process-test-resources, test-compile, process-test-classes, test, prepare-package, package, pre-integration-test, integration-test, post-integration-test, verify, install, deploy]
[DEBUG] Lifecycle clean -> [pre-clean, clean, post-clean]
[DEBUG] Lifecycle site -> [pre-site, site, post-site, site-deploy]
[DEBUG] Lifecycle default -> [validate, initialize, generate-sources, process-sources, generate-resources, process-resources, compile, process-classes, generate-test-sources, process-test-sources, generate-test-resources, process-test-resources, test-compile, process-test-classes, test, prepare-package, package, pre-integration-test, integration-test, post-integration-test, verify, install, deploy]
[DEBUG] Lifecycle clean -> [pre-clean, clean, post-clean]
[DEBUG] Lifecycle site -> [pre-site, site, post-site, site-deploy]
[DEBUG] Lifecycle default -> [validate, initialize, generate-sources, process-sources, generate-resources, process-resources, compile, process-classes, generate-test-sources, process-test-sources, generate-test-resources, process-test-resources, test-compile, process-test-classes, test, prepare-package, package, pre-integration-test, integration-test, post-integration-test, verify, install, deploy]
[DEBUG] Lifecycle clean -> [pre-clean, clean, post-clean]
[DEBUG] Lifecycle site -> [pre-site, site, post-site, site-deploy]
[DEBUG] Lifecycle default -> [validate, initialize, generate-sources, process-sources, generate-resources, process-resources, compile, process-classes, generate-test-sources, process-test-sources, generate-test-resources, process-test-resources, test-compile, process-test-classes, test, prepare-package, package, pre-integration-test, integration-test, post-integration-test, verify, install, deploy]
[DEBUG] Lifecycle clean -> [pre-clean, clean, post-clean]
[DEBUG] Lifecycle site -> [pre-site, site, post-site, site-deploy]
[DEBUG] Lifecycle default -> [validate, initialize, generate-sources, process-sources, generate-resources, process-resources, compile, process-classes, generate-test-sources, process-test-sources, generate-test-resources, process-test-resources, test-compile, process-test-classes, test, prepare-package, package, pre-integration-test, integration-test, post-integration-test, verify, install, deploy]
[DEBUG] Lifecycle clean -> [pre-clean, clean, post-clean]
[DEBUG] Lifecycle site -> [pre-site, site, post-site, site-deploy]
[DEBUG] Lifecycle default -> [validate, initialize, generate-sources, process-sources, generate-resources, process-resources, compile, process-classes, generate-test-sources, process-test-sources, generate-test-resources, process-test-resources, test-compile, process-test-classes, test, prepare-package, package, pre-integration-test, integration-test, post-integration-test, verify, install, deploy]
[DEBUG] Lifecycle clean -> [pre-clean, clean, post-clean]
[DEBUG] Lifecycle site -> [pre-site, site, post-site, site-deploy]
[DEBUG] Lifecycle default -> [validate, initialize, generate-sources, process-sources, generate-resources, process-resources, compile, process-classes, generate-test-sources, process-test-sources, generate-test-resources, process-test-resources, test-compile, process-test-classes, test, prepare-package, package, pre-integration-test, integration-test, post-integration-test, verify, install, deploy]
[DEBUG] Lifecycle clean -> [pre-clean, clean, post-clean]
[DEBUG] Lifecycle site -> [pre-site, site, post-site, site-deploy]
[DEBUG] === PROJECT BUILD PLAN ================================================
[DEBUG] Project: com.microsoft.pnp:spark-monitoring:1.0-SNAPSHOT
[DEBUG] Dependencies (collect): []
[DEBUG] Dependencies (resolve): [compile, test]
[DEBUG] Repositories (dependencies): [bci-mvn (https://rb-artifactory.bosch.com/artifactory/bci-mvn-virtual, default, releases+snapshots), central (https://repo.maven.apache.org/maven2, default, releases)]
[DEBUG] Repositories (plugins) : [bci-mvn (https://rb-artifactory.bosch.com/artifactory/bci-mvn-virtual, default, releases+snapshots), central (https://repo.maven.apache.org/maven2, default, releases)]
[DEBUG] -----------------------------------------------------------------------
[DEBUG] Goal: org.apache.maven.plugins:maven-clean-plugin:3.1.0:clean (default-clean)
[DEBUG] Style: Regular
[DEBUG] Configuration:


${maven.clean.excludeDefaultDirectories}
${maven.clean.failOnError}
${maven.clean.followSymLinks}


${maven.clean.retryOnError}
${maven.clean.skip}

${maven.clean.verbose}

[DEBUG] -----------------------------------------------------------------------
[DEBUG] Goal: net.alchim31.maven:scala-maven-plugin:3.4.2:add-source (default)
[DEBUG] Style: Regular
[DEBUG] Configuration:

${project}


${maven.scala.useCanonicalPath}

[DEBUG] -----------------------------------------------------------------------
[DEBUG] Goal: org.apache.maven.plugins:maven-clean-plugin:3.1.0:clean (auto-clean)
[DEBUG] Style: Regular
[DEBUG] Configuration:


${maven.clean.excludeDefaultDirectories}
${maven.clean.failOnError}
${maven.clean.followSymLinks}


${maven.clean.retryOnError}
${maven.clean.skip}

${maven.clean.verbose}

[DEBUG] -----------------------------------------------------------------------
[DEBUG] Goal: net.alchim31.maven:scala-maven-plugin:3.4.2:compile (default)
[DEBUG] Style: Regular
[DEBUG] Configuration:

${addJavacArgs}
${addScalacArgs}
${addZincArgs}
${analysisCacheFile}

-target:jvm-1.8
-feature
-dependencyfile
C:\Users\KOF4BE\Downloads\spark-monitoring-master\src\target/.scala_dependencies

${maven.scala.checkConsistency}
${compileOrder}
${displayCmd}
${project.build.sourceEncoding}




-source
1.8
-target
1.8${javacArgs}
${javacGenerateDebugSymbols}
${localRepository}
${localRepository}
${notifyCompilation}
${project.build.outputDirectory}

${project}

incremental
${project.remoteArtifactRepositories}
${maven.scala.className}
2.11
${scala.home}
${scala.organization}
2.11.8

${session}

${maven.compiler.source}
${maven.compiler.target}
${maven.scala.useCanonicalPath}
${useZincServer}
${zincHost}
${zincPort}
[DEBUG] -----------------------------------------------------------------------
[DEBUG] Goal: net.alchim31.maven:scala-maven-plugin:3.4.2:testCompile (default)
[DEBUG] Style: Regular
[DEBUG] Configuration:
${addJavacArgs}
${addScalacArgs}
${addZincArgs}
-target:jvm-1.8
-feature
-dependencyfile
C:\Users\KOF4BE\Downloads\spark-monitoring-master\src\target/.scala_dependencies
${maven.scala.checkConsistency}
${compileOrder}
${displayCmd}
${project.build.sourceEncoding}
-source
1.8
-target
1.8
${javacArgs}
${javacGenerateDebugSymbols}
${localRepository}
${localRepository}
${notifyCompilation}
${project}
incremental
${project.remoteArtifactRepositories}
${maven.scala.className}
2.11
${scala.home}
${scala.organization}
2.11.8
${session}
${maven.test.skip}
${maven.compiler.source}
${maven.compiler.target}
${testAnalysisCacheFile}
${maven.scala.useCanonicalPath}
${useZincServer}
${zincHost}
${zincPort}
[DEBUG] -----------------------------------------------------------------------
[DEBUG] Goal: org.scalatest:scalatest-maven-plugin:2.0.0:test (test)
[DEBUG] Style: Regular
[DEBUG] Configuration:
${argLine}
${config}
${debugArgLine}
${debugForkedProcess}
${debuggerPort}
TestSuiteReport.txt
never
${timeout}
${htmlreporters}
${junitClasses}
.
${logForkedProcessCommand}
${membersOnlySuites}
${memoryFiles}
${project.build.outputDirectory}
false
${reporters}
C:\Users\KOF4BE\Downloads\spark-monitoring-master\src\target/surefire-reports
${runpath}
${skipTests}
${spanScaleFactor}
${stderr}
${stdout}
${suffixes}
${suites}
file:src/test/resources/log4j.properties
${tagsToExclude}
${tagsToInclude}
${maven.test.failure.ignore}
${testNGXMLFiles}
${project.build.testOutputDirectory}
${tests}
${testsFiles}
${wildcardSuites}
[DEBUG] -----------------------------------------------------------------------
[DEBUG] Goal: org.apache.maven.plugins:maven-install-plugin:2.4:install (default-install)
[DEBUG] Style: Regular
[DEBUG] Configuration:
${createChecksum}
${localRepository}
${maven.install.skip}
${updateReleaseInfo}
[DEBUG] =======================================================================
[DEBUG] Dependency collection stats: {ConflictMarker.analyzeTime=502818, ConflictMarker.markTime=90194, ConflictMarker.nodeCount=16, ConflictIdSorter.graphTime=281383, ConflictIdSorter.topsortTime=247899, ConflictIdSorter.conflictIdCount=9, ConflictIdSorter.conflictIdCycleCount=0, ConflictResolver.totalTime=2094986, ConflictResolver.conflictItemCount=15, DefaultDependencyCollector.collectTime=84951425, DefaultDependencyCollector.transformTime=4712232}
[DEBUG] com.microsoft.pnp:spark-monitoring:pom:1.0-SNAPSHOT
[DEBUG] org.scala-lang:scala-library:jar:2.11.8:provided
[DEBUG] org.slf4j:slf4j-api:jar:1.7.7:provided
[DEBUG] junit:junit:jar:4.12:test
[DEBUG] org.hamcrest:hamcrest-core:jar:1.3:test
[DEBUG] org.scalatest:scalatest_2.11:jar:3.0.3:test
[DEBUG] org.scalactic:scalactic_2.11:jar:3.0.3:test
[DEBUG] org.scala-lang:scala-reflect:jar:2.11.8:test
[DEBUG] org.scala-lang.modules:scala-xml_2.11:jar:1.0.5:test
[DEBUG] org.scala-lang.modules:scala-parser-combinators_2.11:jar:1.0.4:test
[INFO]
[INFO] --- maven-clean-plugin:3.1.0:clean (default-clean) @ spark-monitoring ---
[DEBUG] Dependency collection stats: {ConflictMarker.analyzeTime=56709, ConflictMarker.markTime=51848, ConflictMarker.nodeCount=14, ConflictIdSorter.graphTime=12962, ConflictIdSorter.topsortTime=11882, ConflictIdSorter.conflictIdCount=12, ConflictIdSorter.conflictIdCycleCount=0, ConflictResolver.totalTime=345113, ConflictResolver.conflictItemCount=14, DefaultDependencyCollector.collectTime=187199038, DefaultDependencyCollector.transformTime=526042}
[DEBUG] org.apache.maven.plugins:maven-clean-plugin:jar:3.1.0:
[DEBUG] org.apache.maven:maven-plugin-api:jar:3.0:compile
[DEBUG] org.apache.maven:maven-model:jar:3.0:compile
[DEBUG] org.codehaus.plexus:plexus-utils:jar:2.0.4:compile
[DEBUG] org.apache.maven:maven-artifact:jar:3.0:compile
[DEBUG] org.sonatype.sisu:sisu-inject-plexus:jar:1.4.2:compile
[DEBUG] org.codehaus.plexus:plexus-component-annotations:jar:1.7.1:compile
[DEBUG] org.codehaus.plexus:plexus-classworlds:jar:2.2.3:compile
[DEBUG] org.sonatype.sisu:sisu-inject-bean:jar:1.4.2:compile
[DEBUG] org.sonatype.sisu:sisu-guice:jar:noaop:2.1.7:compile
[DEBUG] org.apache.maven.shared:maven-shared-utils:jar:3.2.1:compile
[DEBUG] commons-io:commons-io:jar:2.5:compile
[DEBUG] Created new class realm plugin>org.apache.maven.plugins:maven-clean-plugin:3.1.0
[DEBUG] Importing foreign packages into class realm plugin>org.apache.maven.plugins:maven-clean-plugin:3.1.0
[DEBUG] Imported: < maven.api
[DEBUG] Populating class realm plugin>org.apache.maven.plugins:maven-clean-plugin:3.1.0
[DEBUG] Included: org.apache.maven.plugins:maven-clean-plugin:jar:3.1.0
[DEBUG] Included: org.codehaus.plexus:plexus-utils:jar:2.0.4
[DEBUG] Included: org.codehaus.plexus:plexus-component-annotations:jar:1.7.1
[DEBUG] Included: org.sonatype.sisu:sisu-inject-bean:jar:1.4.2
[DEBUG] Included: org.sonatype.sisu:sisu-guice:jar:noaop:2.1.7
[DEBUG] Included: org.apache.maven.shared:maven-shared-utils:jar:3.2.1
[DEBUG] Included: commons-io:commons-io:jar:2.5
[DEBUG] Configuring mojo org.apache.maven.plugins:maven-clean-plugin:3.1.0:clean from plugin realm ClassRealm[plugin>org.apache.maven.plugins:maven-clean-plugin:3.1.0, parent: jdk.internal.loader.ClassLoaders$AppClassLoader@c387f44]
[DEBUG] Configuring mojo 'org.apache.maven.plugins:maven-clean-plugin:3.1.0:clean' with basic configurator -->
[DEBUG] (f) directory = C:\Users\KOF4BE\Downloads\spark-monitoring-master\src\target
[DEBUG] (f) excludeDefaultDirectories = false
[DEBUG] (f) failOnError = true
[DEBUG] (f) followSymLinks = false
[DEBUG] (f) outputDirectory = C:\Users\KOF4BE\Downloads\spark-monitoring-master\src\target\classes
[DEBUG] (f) reportDirectory = C:\Users\KOF4BE\Downloads\spark-monitoring-master\src\target\classes
[DEBUG] (f) retryOnError = true
[DEBUG] (f) skip = false
[DEBUG] (f) testOutputDirectory = C:\Users\KOF4BE\Downloads\spark-monitoring-master\src\target\test-classes
[DEBUG] -- end configuration --
[INFO] Deleting C:\Users\KOF4BE\Downloads\spark-monitoring-master\src\target
[INFO] Deleting directory C:\Users\KOF4BE\Downloads\spark-monitoring-master\src\target\test-classes
[INFO] Deleting file C:\Users\KOF4BE\Downloads\spark-monitoring-master\src\target\surefire-reports\TestSuiteReport.txt
[INFO] Deleting file C:\Users\KOF4BE\Downloads\spark-monitoring-master\src\target\surefire-reports\TEST-org.scalatest.tools.DiscoverySuite-8d1309e3-73ef-4f76-8bc7-d3775f88f842.xml
[INFO] Deleting directory C:\Users\KOF4BE\Downloads\spark-monitoring-master\src\target\surefire-reports
[INFO] Deleting directory C:\Users\KOF4BE\Downloads\spark-monitoring-master\src\target\classes
[INFO] Deleting directory C:\Users\KOF4BE\Downloads\spark-monitoring-master\src\target
[DEBUG] Skipping non-existing directory C:\Users\KOF4BE\Downloads\spark-monitoring-master\src\target\classes
[DEBUG] Skipping non-existing directory C:\Users\KOF4BE\Downloads\spark-monitoring-master\src\target\test-classes
[DEBUG] Skipping non-existing directory C:\Users\KOF4BE\Downloads\spark-monitoring-master\src\target\classes
[INFO]
[INFO] --- scala-maven-plugin:3.4.2:add-source (default) @ spark-monitoring ---
[DEBUG] Dependency collection stats: {ConflictMarker.analyzeTime=701569, ConflictMarker.markTime=387240, ConflictMarker.nodeCount=245, ConflictIdSorter.graphTime=171206, ConflictIdSorter.topsortTime=54549, ConflictIdSorter.conflictIdCount=78, ConflictIdSorter.conflictIdCycleCount=0, ConflictResolver.totalTime=4946629, ConflictResolver.conflictItemCount=167, DefaultDependencyCollector.collectTime=1173606205, DefaultDependencyCollector.transformTime=6288738}
[DEBUG] net.alchim31.maven:scala-maven-plugin:jar:3.4.2:
[DEBUG] org.apache.maven:maven-compat:jar:3.5.4:compile
[DEBUG] org.apache.maven:maven-model-builder:jar:3.5.4:compile
[DEBUG] org.apache.maven:maven-settings:jar:3.5.4:compile
[DEBUG] org.apache.maven:maven-settings-builder:jar:3.5.4:compile
[DEBUG] org.sonatype.plexus:plexus-sec-dispatcher:jar:1.4:compile
[DEBUG] org.sonatype.plexus:plexus-cipher:jar:1.4:compile
[DEBUG] org.apache.maven:maven-artifact:jar:3.5.4:compile
[DEBUG] org.apache.maven:maven-resolver-provider:jar:3.5.4:compile
[DEBUG] org.apache.maven:maven-repository-metadata:jar:3.5.4:compile
[DEBUG] org.apache.maven.resolver:maven-resolver-api:jar:1.1.1:compile
[DEBUG] org.apache.maven.resolver:maven-resolver-util:jar:1.1.1:compile
[DEBUG] org.apache.maven.resolver:maven-resolver-impl:jar:1.1.1:compile
[DEBUG] org.codehaus.plexus:plexus-interpolation:jar:1.24:compile
[DEBUG] org.eclipse.sisu:org.eclipse.sisu.plexus:jar:0.3.3:compile
[DEBUG] javax.enterprise:cdi-api:jar:1.0:compile
[DEBUG] javax.annotation:jsr250-api:jar:1.0:compile
[DEBUG] org.codehaus.plexus:plexus-component-annotations:jar:1.7.1:compile
[DEBUG] org.apache.maven.wagon:wagon-provider-api:jar:3.1.0:compile
[DEBUG] org.apache.maven.reporting:maven-reporting-api:jar:3.0:compile
[DEBUG] org.apache.maven:maven-core:jar:3.5.4:compile
[DEBUG] org.apache.maven:maven-builder-support:jar:3.5.4:compile
[DEBUG] org.apache.maven:maven-plugin-api:jar:3.5.4:compile
[DEBUG] org.apache.maven.resolver:maven-resolver-spi:jar:1.1.1:compile
[DEBUG] org.apache.maven.shared:maven-shared-utils:jar:3.2.1:compile
[DEBUG] org.eclipse.sisu:org.eclipse.sisu.inject:jar:0.3.3:compile
[DEBUG] com.google.inject:guice:jar:no_aop:4.2.0:compile
[DEBUG] aopalliance:aopalliance:jar:1.0:compile
[DEBUG] com.google.guava:guava:jar:20.0:compile
[DEBUG] javax.inject:javax.inject:jar:1:compile
[DEBUG] org.apache.commons:commons-lang3:jar:3.5:compile
[DEBUG] org.apache.maven.shared:maven-dependency-tree:jar:3.0.1:compile
[DEBUG] org.eclipse.aether:aether-util:jar:0.9.0.M2:compile
[DEBUG] org.apache.commons:commons-exec:jar:1.3:compile
[DEBUG] org.codehaus.plexus:plexus-utils:jar:3.1.0:compile
[DEBUG] org.codehaus.plexus:plexus-archiver:jar:3.6.0:compile
[DEBUG] org.codehaus.plexus:plexus-io:jar:3.0.1:compile
[DEBUG] org.apache.commons:commons-compress:jar:1.16.1:compile
[DEBUG] org.objenesis:objenesis:jar:2.6:compile
[DEBUG] org.iq80.snappy:snappy:jar:0.4:compile
[DEBUG] org.tukaani:xz:jar:1.8:runtime
[DEBUG] org.codehaus.plexus:plexus-classworlds:jar:2.5.2:compile
[DEBUG] org.apache.maven:maven-project:jar:2.2.1:compile
[DEBUG] org.apache.maven:maven-profile:jar:2.2.1:compile
[DEBUG] org.apache.maven:maven-artifact-manager:jar:2.2.1:compile
[DEBUG] backport-util-concurrent:backport-util-concurrent:jar:3.1:compile
[DEBUG] org.apache.maven:maven-plugin-registry:jar:2.2.1:compile
[DEBUG] org.codehaus.plexus:plexus-container-default:jar:1.0-alpha-9-stable-1:compile
[DEBUG] junit:junit:jar:3.8.1:compile
[DEBUG] classworlds:classworlds:jar:1.1-alpha-2:compile
[DEBUG] org.apache.maven:maven-archiver:jar:3.2.0:compile
[DEBUG] commons-io:commons-io:jar:2.5:compile
[DEBUG] org.apache.maven.doxia:doxia-sink-api:jar:1.8:compile
[DEBUG] org.apache.maven.doxia:doxia-logging-api:jar:1.8:compile
[DEBUG] org.apache.maven:maven-model:jar:3.5.4:compile
[DEBUG] org.apache.maven.shared:maven-invoker:jar:3.0.1:compile
[DEBUG] com.typesafe.zinc:zinc:jar:0.3.15:compile
[DEBUG] org.scala-lang:scala-library:jar:2.10.6:compile
[DEBUG] com.typesafe.sbt:incremental-compiler:jar:0.13.15:compile
[DEBUG] org.scala-lang:scala-compiler:jar:2.10.6:compile
[DEBUG] org.scala-lang:scala-reflect:jar:2.10.6:compile
[DEBUG] com.typesafe.sbt:sbt-interface:jar:0.13.15:compile
[DEBUG] com.typesafe.sbt:compiler-interface:jar:sources:0.13.15:compile
[DEBUG] Created new class realm plugin>net.alchim31.maven:scala-maven-plugin:3.4.2
[DEBUG] Importing foreign packages into class realm plugin>net.alchim31.maven:scala-maven-plugin:3.4.2
[DEBUG] Imported: < maven.api
[DEBUG] Populating class realm plugin>net.alchim31.maven:scala-maven-plugin:3.4.2
[DEBUG] Included: net.alchim31.maven:scala-maven-plugin:jar:3.4.2
[DEBUG] Included: org.sonatype.plexus:plexus-sec-dispatcher:jar:1.4
[DEBUG] Included: org.sonatype.plexus:plexus-cipher:jar:1.4
[DEBUG] Included: org.apache.maven.resolver:maven-resolver-util:jar:1.1.1
[DEBUG] Included: org.codehaus.plexus:plexus-interpolation:jar:1.24
[DEBUG] Included: javax.enterprise:cdi-api:jar:1.0
[DEBUG] Included: org.codehaus.plexus:plexus-component-annotations:jar:1.7.1
[DEBUG] Included: org.apache.maven.reporting:maven-reporting-api:jar:3.0
[DEBUG] Included: org.apache.maven:maven-builder-support:jar:3.5.4
[DEBUG] Included: org.apache.maven.shared:maven-shared-utils:jar:3.2.1
[DEBUG] Included: org.eclipse.sisu:org.eclipse.sisu.inject:jar:0.3.3
[DEBUG] Included: com.google.inject:guice:jar:no_aop:4.2.0
[DEBUG] Included: aopalliance:aopalliance:jar:1.0
[DEBUG] Included: com.google.guava:guava:jar:20.0
[DEBUG] Included: org.apache.commons:commons-lang3:jar:3.5
[DEBUG] Included: org.apache.maven.shared:maven-dependency-tree:jar:3.0.1
[DEBUG] Included: org.eclipse.aether:aether-util:jar:0.9.0.M2
[DEBUG] Included: org.apache.commons:commons-exec:jar:1.3
[DEBUG] Included: org.codehaus.plexus:plexus-utils:jar:3.1.0
[DEBUG] Included: org.codehaus.plexus:plexus-archiver:jar:3.6.0
[DEBUG] Included: org.codehaus.plexus:plexus-io:jar:3.0.1
[DEBUG] Included: org.apache.commons:commons-compress:jar:1.16.1
[DEBUG] Included: org.objenesis:objenesis:jar:2.6
[DEBUG] Included: org.iq80.snappy:snappy:jar:0.4
[DEBUG] Included: org.tukaani:xz:jar:1.8
[DEBUG] Included: backport-util-concurrent:backport-util-concurrent:jar:3.1
[DEBUG] Included: junit:junit:jar:3.8.1
[DEBUG] Included: org.apache.maven:maven-archiver:jar:3.2.0
[DEBUG] Included: commons-io:commons-io:jar:2.5
[DEBUG] Included: org.apache.maven.doxia:doxia-sink-api:jar:1.8
[DEBUG] Included: org.apache.maven.doxia:doxia-logging-api:jar:1.8
[DEBUG] Included: org.apache.maven.shared:maven-invoker:jar:3.0.1
[DEBUG] Included: com.typesafe.zinc:zinc:jar:0.3.15
[DEBUG] Included: org.scala-lang:scala-library:jar:2.10.6
[DEBUG] Included: com.typesafe.sbt:incremental-compiler:jar:0.13.15
[DEBUG] Included: org.scala-lang:scala-compiler:jar:2.10.6
[DEBUG] Included: org.scala-lang:scala-reflect:jar:2.10.6
[DEBUG] Included: com.typesafe.sbt:sbt-interface:jar:0.13.15
[DEBUG] Included: com.typesafe.sbt:compiler-interface:jar:sources:0.13.15
[DEBUG] Configuring mojo net.alchim31.maven:scala-maven-plugin:3.4.2:add-source from plugin realm ClassRealm[plugin>net.alchim31.maven:scala-maven-plugin:3.4.2, parent: jdk.internal.loader.ClassLoaders$AppClassLoader@c387f44]
[DEBUG] Configuring mojo 'net.alchim31.maven:scala-maven-plugin:3.4.2:add-source' with basic configurator -->
[DEBUG] (f) project = MavenProject: com.microsoft.pnp:spark-monitoring:1.0-SNAPSHOT @ C:\Users\KOF4BE\Downloads\spark-monitoring-master\src\pom.xml
[DEBUG] (f) sourceDir = C:\Users\KOF4BE\Downloads\spark-monitoring-master\src\src\main\java\..\scala
[DEBUG] (f) testSourceDir = C:\Users\KOF4BE\Downloads\spark-monitoring-master\src\src\test\java\..\scala
[DEBUG] (f) useCanonicalPath = true
[DEBUG] -- end configuration --
[INFO] Add Source directory: C:\Users\KOF4BE\Downloads\spark-monitoring-master\src\src\main\scala
[INFO] Add Test Source directory: C:\Users\KOF4BE\Downloads\spark-monitoring-master\src\src\test\scala
[INFO]
[INFO] --- maven-clean-plugin:3.1.0:clean (auto-clean) @ spark-monitoring ---
[DEBUG] Configuring mojo org.apache.maven.plugins:maven-clean-plugin:3.1.0:clean from plugin realm ClassRealm[plugin>org.apache.maven.plugins:maven-clean-plugin:3.1.0, parent: jdk.internal.loader.ClassLoaders$AppClassLoader@c387f44]
[DEBUG] Configuring mojo 'org.apache.maven.plugins:maven-clean-plugin:3.1.0:clean' with basic configurator -->
[DEBUG] (f) directory = C:\Users\KOF4BE\Downloads\spark-monitoring-master\src\target
[DEBUG] (f) excludeDefaultDirectories = false
[DEBUG] (f) failOnError = true
[DEBUG] (f) followSymLinks = false
[DEBUG] (f) outputDirectory = C:\Users\KOF4BE\Downloads\spark-monitoring-master\src\target\classes
[DEBUG] (f) reportDirectory = C:\Users\KOF4BE\Downloads\spark-monitoring-master\src\target\classes
[DEBUG] (f) retryOnError = true
[DEBUG] (f) skip = false
[DEBUG] (f) testOutputDirectory = C:\Users\KOF4BE\Downloads\spark-monitoring-master\src\target\test-classes
[DEBUG] -- end configuration --
[DEBUG] Skipping non-existing directory C:\Users\KOF4BE\Downloads\spark-monitoring-master\src\target
[DEBUG] Skipping non-existing directory C:\Users\KOF4BE\Downloads\spark-monitoring-master\src\target\classes
[DEBUG] Skipping non-existing directory C:\Users\KOF4BE\Downloads\spark-monitoring-master\src\target\test-classes
[DEBUG] Skipping non-existing directory C:\Users\KOF4BE\Downloads\spark-monitoring-master\src\target\classes
[INFO]
[INFO] --- scala-maven-plugin:3.4.2:compile (default) @ spark-monitoring ---
[DEBUG] Configuring mojo net.alchim31.maven:scala-maven-plugin:3.4.2:compile from plugin realm ClassRealm[plugin>net.alchim31.maven:scala-maven-plugin:3.4.2, parent: jdk.internal.loader.ClassLoaders$AppClassLoader@c387f44]
[DEBUG] Configuring mojo 'net.alchim31.maven:scala-maven-plugin:3.4.2:compile' with basic configurator -->
[DEBUG] (f) analysisCacheFile = C:\Users\KOF4BE\Downloads\spark-monitoring-master\src\target\analysis\compile
[DEBUG] (f) args = [-target:jvm-1.8, -feature, -dependencyfile, C:\Users\KOF4BE\Downloads\spark-monitoring-master\src\target/.scala_dependencies]
[DEBUG] (f) checkMultipleScalaVersions = true
[DEBUG] (f) compileOrder = mixed
[DEBUG] (f) displayCmd = false
[DEBUG] (f) encoding = UTF-8
[DEBUG] (f) failOnMultipleScalaVersions = false
[DEBUG] (f) forceUseArgFile = false
[DEBUG] (f) fork = true
[DEBUG] (f) javacArgs = [-source, 1.8, -target, 1.8]
[DEBUG] (f) javacGenerateDebugSymbols = true
[DEBUG] (f) localRepo = id: local
url: file:///C:/Users/KOF4BE/.m2/repository/
layout: default
snapshots: [enabled => true, update => always]
releases: [enabled => true, update => always]

[DEBUG] (f) localRepository = id: local
url: file:///C:/Users/KOF4BE/.m2/repository/
layout: default
snapshots: [enabled => true, update => always]
releases: [enabled => true, update => always]

[DEBUG] (f) notifyCompilation = true
[DEBUG] (f) outputDir = C:\Users\KOF4BE\Downloads\spark-monitoring-master\src\target\classes
[DEBUG] (f) pluginArtifacts = [net.alchim31.maven:scala-maven-plugin:maven-plugin:3.4.2:, org.apache.maven:maven-compat:jar:3.5.4:compile, org.apache.maven:maven-model-builder:jar:3.5.4:compile, org.apache.maven:maven-settings:jar:3.5.4:compile, org.apache.maven:maven-settings-builder:jar:3.5.4:compile, org.sonatype.plexus:plexus-sec-dispatcher:jar:1.4:compile, org.sonatype.plexus:plexus-cipher:jar:1.4:compile, org.apache.maven:maven-artifact:jar:3.5.4:compile, org.apache.maven:maven-resolver-provider:jar:3.5.4:compile, org.apache.maven:maven-repository-metadata:jar:3.5.4:compile, org.apache.maven.resolver:maven-resolver-api:jar:1.1.1:compile, org.apache.maven.resolver:maven-resolver-util:jar:1.1.1:compile, org.apache.maven.resolver:maven-resolver-impl:jar:1.1.1:compile, org.codehaus.plexus:plexus-interpolation:jar:1.24:compile, org.eclipse.sisu:org.eclipse.sisu.plexus:jar:0.3.3:compile, javax.enterprise:cdi-api:jar:1.0:compile, javax.annotation:jsr250-api:jar:1.0:compile, org.codehaus.plexus:plexus-component-annotations:jar:1.7.1:compile, org.apache.maven.wagon:wagon-provider-api:jar:3.1.0:compile, org.apache.maven.reporting:maven-reporting-api:jar:3.0:compile, org.apache.maven:maven-core:jar:3.5.4:compile, org.apache.maven:maven-builder-support:jar:3.5.4:compile, org.apache.maven:maven-plugin-api:jar:3.5.4:compile, org.apache.maven.resolver:maven-resolver-spi:jar:1.1.1:compile, org.apache.maven.shared:maven-shared-utils:jar:3.2.1:compile, org.eclipse.sisu:org.eclipse.sisu.inject:jar:0.3.3:compile, com.google.inject:guice:jar:no_aop:4.2.0:compile, aopalliance:aopalliance:jar:1.0:compile, com.google.guava:guava:jar:20.0:compile, javax.inject:javax.inject:jar:1:compile, org.apache.commons:commons-lang3:jar:3.5:compile, org.apache.maven.shared:maven-dependency-tree:jar:3.0.1:compile, org.eclipse.aether:aether-util:jar:0.9.0.M2:compile, org.apache.commons:commons-exec:jar:1.3:compile, org.codehaus.plexus:plexus-utils:jar:3.1.0:compile, org.codehaus.plexus:plexus-archiver:jar:3.6.0:compile, org.codehaus.plexus:plexus-io:jar:3.0.1:compile, org.apache.commons:commons-compress:jar:1.16.1:compile, org.objenesis:objenesis:jar:2.6:compile, org.iq80.snappy:snappy:jar:0.4:compile, org.tukaani:xz:jar:1.8:runtime, org.codehaus.plexus:plexus-classworlds:jar:2.5.2:compile, org.apache.maven:maven-project:jar:2.2.1:compile, org.apache.maven:maven-profile:jar:2.2.1:compile, org.apache.maven:maven-artifact-manager:jar:2.2.1:compile, backport-util-concurrent:backport-util-concurrent:jar:3.1:compile, org.apache.maven:maven-plugin-registry:jar:2.2.1:compile, org.codehaus.plexus:plexus-container-default:jar:1.0-alpha-9-stable-1:compile, junit:junit:jar:3.8.1:compile, classworlds:classworlds:jar:1.1-alpha-2:compile, org.apache.maven:maven-archiver:jar:3.2.0:compile, commons-io:commons-io:jar:2.5:compile, org.apache.maven.doxia:doxia-sink-api:jar:1.8:compile, org.apache.maven.doxia:doxia-logging-api:jar:1.8:compile, org.apache.maven:maven-model:jar:3.5.4:compile, org.apache.maven.shared:maven-invoker:jar:3.0.1:compile, com.typesafe.zinc:zinc:jar:0.3.15:compile, org.scala-lang:scala-library:jar:2.10.6:compile, com.typesafe.sbt:incremental-compiler:jar:0.13.15:compile, org.scala-lang:scala-compiler:jar:2.10.6:compile, org.scala-lang:scala-reflect:jar:2.10.6:compile, com.typesafe.sbt:sbt-interface:jar:0.13.15:compile, com.typesafe.sbt:compiler-interface:jar:sources:0.13.15:compile]
[DEBUG] (f) project = MavenProject: com.microsoft.pnp:spark-monitoring:1.0-SNAPSHOT @ C:\Users\KOF4BE\Downloads\spark-monitoring-master\src\pom.xml
[DEBUG] (f) reactorProjects = [MavenProject: com.microsoft.pnp:spark-monitoring:1.0-SNAPSHOT @ C:\Users\KOF4BE\Downloads\spark-monitoring-master\src\pom.xml, MavenProject: com.microsoft.pnp:spark-listeners:1.0-SNAPSHOT @ C:\Users\KOF4BE\Downloads\spark-monitoring-master\src\spark-listeners\pom.xml, MavenProject: com.microsoft.pnp:spark-jobs:1.0-SNAPSHOT @ C:\Users\KOF4BE\Downloads\spark-monitoring-master\src\spark-jobs\pom.xml, MavenProject: com.microsoft.pnp:spark-listeners-loganalytics:1.0-SNAPSHOT @ C:\Users\KOF4BE\Downloads\spark-monitoring-master\src\spark-listeners-loganalytics\pom.xml]
[DEBUG] (f) recompileMode = incremental
[DEBUG] (f) remoteRepos = [ id: bci-mvn
url: https://rb-artifactory.bosch.com/artifactory/bci-mvn-virtual
layout: default
snapshots: [enabled => true, update => daily]
releases: [enabled => true, update => daily]
, id: central
url: https://repo.maven.apache.org/maven2
layout: default
snapshots: [enabled => false, update => daily]
releases: [enabled => true, update => daily]
]
[DEBUG] (f) scalaClassName = scala.tools.nsc.Main
[DEBUG] (f) scalaCompatVersion = 2.11
[DEBUG] (f) scalaOrganization = org.scala-lang
[DEBUG] (f) scalaVersion = 2.11.8
[DEBUG] (f) sendJavaToScalac = true
[DEBUG] (f) session = org.apache.maven.execution.MavenSession@2e807c54
[DEBUG] (f) source = 1.8
[DEBUG] (f) sourceDir = C:\Users\KOF4BE\Downloads\spark-monitoring-master\src\src\main\java\..\scala
[DEBUG] (f) target = 1.8
[DEBUG] (f) useCanonicalPath = true
[DEBUG] (f) useZincServer = false
[DEBUG] (f) zincHost = 127.0.0.1
[DEBUG] (f) zincPort = 3030
[DEBUG] -- end configuration --
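
Note: the `localRepo` and `remoteRepos` values dumped above are not defined in this project's pom; Maven resolves them from the machine's settings. A minimal `settings.xml` sketch that would yield them, assuming the standard Maven settings schema (the `bci-mvn` id and Artifactory URL are copied from the log; everything else is illustrative, not the file actually used in this build):

```xml
<!-- Hypothetical %USERPROFILE%\.m2\settings.xml: a sketch reconstructing the
     (f) localRepo and (f) remoteRepos values above, not the real settings file. -->
<settings xmlns="http://maven.apache.org/SETTINGS/1.0.0">
  <!-- Matches: url: file:///C:/Users/KOF4BE/.m2/repository/ -->
  <localRepository>C:/Users/KOF4BE/.m2/repository</localRepository>
  <profiles>
    <profile>
      <id>corporate-repos</id>
      <repositories>
        <!-- Matches the bci-mvn entry in the (f) remoteRepos dump -->
        <repository>
          <id>bci-mvn</id>
          <url>https://rb-artifactory.bosch.com/artifactory/bci-mvn-virtual</url>
          <releases>
            <updatePolicy>daily</updatePolicy>
          </releases>
          <snapshots>
            <updatePolicy>daily</updatePolicy>
          </snapshots>
        </repository>
      </repositories>
    </profile>
  </profiles>
  <activeProfiles>
    <activeProfile>corporate-repos</activeProfile>
  </activeProfiles>
</settings>
```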
[DEBUG] Checking for multiple versions of scala
[DEBUG] building maven31 dependency graph for com.microsoft.pnp:spark-monitoring:pom:1.0-SNAPSHOT with Maven31DependencyGraphBuilder
[DEBUG] Dependency collection stats: {ConflictMarker.analyzeTime=112877, ConflictMarker.markTime=24304, ConflictMarker.nodeCount=16, ConflictIdSorter.graphTime=29705, ConflictIdSorter.topsortTime=11882, ConflictIdSorter.conflictIdCount=9, ConflictIdSorter.conflictIdCycleCount=0, ConflictResolver.totalTime=387780, ConflictResolver.conflictItemCount=15, DefaultDependencyCollector.collectTime=6836923, DefaultDependencyCollector.transformTime=579510}
[DEBUG] com.microsoft.pnp:spark-monitoring:pom:1.0-SNAPSHOT
[DEBUG] org.scala-lang:scala-library:jar:2.11.8:provided
[DEBUG] org.slf4j:slf4j-api:jar:1.7.7:provided
[DEBUG] junit:junit:jar:4.12:test
[DEBUG] org.hamcrest:hamcrest-core:jar:1.3:test
[DEBUG] org.scalatest:scalatest_2.11:jar:3.0.3:test
[DEBUG] org.scalactic:scalactic_2.11:jar:3.0.3:test
[DEBUG] org.scala-lang:scala-reflect:jar:2.11.8:test
[DEBUG] org.scala-lang.modules:scala-xml_2.11:jar:1.0.5:test
[DEBUG] org.scala-lang.modules:scala-parser-combinators_2.11:jar:1.0.4:test
[DEBUG] checking [com.microsoft.pnp:spark-monitoring:pom:1.0-SNAPSHOT] for scala version
[DEBUG] checking [org.scala-lang:scala-library:jar:2.11.8:provided] for scala version
[DEBUG] includes = [**/*.java,**/*.scala,]
[DEBUG] excludes = []
[INFO] No sources to compile
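
Note: `No sources to compile` is expected here. This first reactor entry is the root module (`com.microsoft.pnp:spark-monitoring`, packaging `pom`, as the install step further down confirms); it only aggregates the child modules and has no Scala sources of its own. A sketch of what such an aggregator looks like (module names taken from the `reactorProjects` dump below; the remaining details are illustrative, not copied from the project's pom):

```xml
<!-- Sketch of an aggregator pom; the module list mirrors the reactorProjects
     dump in this log, other details are illustrative. -->
<project xmlns="http://maven.apache.org/POM/4.0.0">
  <modelVersion>4.0.0</modelVersion>
  <groupId>com.microsoft.pnp</groupId>
  <artifactId>spark-monitoring</artifactId>
  <version>1.0-SNAPSHOT</version>
  <packaging>pom</packaging>
  <modules>
    <module>spark-listeners</module>
    <module>spark-jobs</module>
    <module>spark-listeners-loganalytics</module>
  </modules>
</project>
```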
[INFO]
[INFO] --- scala-maven-plugin:3.4.2:testCompile (default) @ spark-monitoring ---
[DEBUG] Configuring mojo net.alchim31.maven:scala-maven-plugin:3.4.2:testCompile from plugin realm ClassRealm[plugin>net.alchim31.maven:scala-maven-plugin:3.4.2, parent: jdk.internal.loader.ClassLoaders$AppClassLoader@c387f44]
[DEBUG] Configuring mojo 'net.alchim31.maven:scala-maven-plugin:3.4.2:testCompile' with basic configurator -->
[DEBUG] (f) args = [-target:jvm-1.8, -feature, -dependencyfile, C:\Users\KOF4BE\Downloads\spark-monitoring-master\src\target/.scala_dependencies]
[DEBUG] (f) checkMultipleScalaVersions = true
[DEBUG] (f) compileOrder = mixed
[DEBUG] (f) displayCmd = false
[DEBUG] (f) encoding = UTF-8
[DEBUG] (f) failOnMultipleScalaVersions = false
[DEBUG] (f) forceUseArgFile = false
[DEBUG] (f) fork = true
[DEBUG] (f) javacArgs = [-source, 1.8, -target, 1.8]
[DEBUG] (f) javacGenerateDebugSymbols = true
[DEBUG] (f) localRepo = id: local
url: file:///C:/Users/KOF4BE/.m2/repository/
layout: default
snapshots: [enabled => true, update => always]
releases: [enabled => true, update => always]

[DEBUG] (f) localRepository = id: local
url: file:///C:/Users/KOF4BE/.m2/repository/
layout: default
snapshots: [enabled => true, update => always]
releases: [enabled => true, update => always]

[DEBUG] (f) notifyCompilation = true
[DEBUG] (f) pluginArtifacts = [net.alchim31.maven:scala-maven-plugin:maven-plugin:3.4.2:, org.apache.maven:maven-compat:jar:3.5.4:compile, org.apache.maven:maven-model-builder:jar:3.5.4:compile, org.apache.maven:maven-settings:jar:3.5.4:compile, org.apache.maven:maven-settings-builder:jar:3.5.4:compile, org.sonatype.plexus:plexus-sec-dispatcher:jar:1.4:compile, org.sonatype.plexus:plexus-cipher:jar:1.4:compile, org.apache.maven:maven-artifact:jar:3.5.4:compile, org.apache.maven:maven-resolver-provider:jar:3.5.4:compile, org.apache.maven:maven-repository-metadata:jar:3.5.4:compile, org.apache.maven.resolver:maven-resolver-api:jar:1.1.1:compile, org.apache.maven.resolver:maven-resolver-util:jar:1.1.1:compile, org.apache.maven.resolver:maven-resolver-impl:jar:1.1.1:compile, org.codehaus.plexus:plexus-interpolation:jar:1.24:compile, org.eclipse.sisu:org.eclipse.sisu.plexus:jar:0.3.3:compile, javax.enterprise:cdi-api:jar:1.0:compile, javax.annotation:jsr250-api:jar:1.0:compile, org.codehaus.plexus:plexus-component-annotations:jar:1.7.1:compile, org.apache.maven.wagon:wagon-provider-api:jar:3.1.0:compile, org.apache.maven.reporting:maven-reporting-api:jar:3.0:compile, org.apache.maven:maven-core:jar:3.5.4:compile, org.apache.maven:maven-builder-support:jar:3.5.4:compile, org.apache.maven:maven-plugin-api:jar:3.5.4:compile, org.apache.maven.resolver:maven-resolver-spi:jar:1.1.1:compile, org.apache.maven.shared:maven-shared-utils:jar:3.2.1:compile, org.eclipse.sisu:org.eclipse.sisu.inject:jar:0.3.3:compile, com.google.inject:guice:jar:no_aop:4.2.0:compile, aopalliance:aopalliance:jar:1.0:compile, com.google.guava:guava:jar:20.0:compile, javax.inject:javax.inject:jar:1:compile, org.apache.commons:commons-lang3:jar:3.5:compile, org.apache.maven.shared:maven-dependency-tree:jar:3.0.1:compile, org.eclipse.aether:aether-util:jar:0.9.0.M2:compile, org.apache.commons:commons-exec:jar:1.3:compile, org.codehaus.plexus:plexus-utils:jar:3.1.0:compile, org.codehaus.plexus:plexus-archiver:jar:3.6.0:compile, org.codehaus.plexus:plexus-io:jar:3.0.1:compile, org.apache.commons:commons-compress:jar:1.16.1:compile, org.objenesis:objenesis:jar:2.6:compile, org.iq80.snappy:snappy:jar:0.4:compile, org.tukaani:xz:jar:1.8:runtime, org.codehaus.plexus:plexus-classworlds:jar:2.5.2:compile, org.apache.maven:maven-project:jar:2.2.1:compile, org.apache.maven:maven-profile:jar:2.2.1:compile, org.apache.maven:maven-artifact-manager:jar:2.2.1:compile, backport-util-concurrent:backport-util-concurrent:jar:3.1:compile, org.apache.maven:maven-plugin-registry:jar:2.2.1:compile, org.codehaus.plexus:plexus-container-default:jar:1.0-alpha-9-stable-1:compile, junit:junit:jar:3.8.1:compile, classworlds:classworlds:jar:1.1-alpha-2:compile, org.apache.maven:maven-archiver:jar:3.2.0:compile, commons-io:commons-io:jar:2.5:compile, org.apache.maven.doxia:doxia-sink-api:jar:1.8:compile, org.apache.maven.doxia:doxia-logging-api:jar:1.8:compile, org.apache.maven:maven-model:jar:3.5.4:compile, org.apache.maven.shared:maven-invoker:jar:3.0.1:compile, com.typesafe.zinc:zinc:jar:0.3.15:compile, org.scala-lang:scala-library:jar:2.10.6:compile, com.typesafe.sbt:incremental-compiler:jar:0.13.15:compile, org.scala-lang:scala-compiler:jar:2.10.6:compile, org.scala-lang:scala-reflect:jar:2.10.6:compile, com.typesafe.sbt:sbt-interface:jar:0.13.15:compile, com.typesafe.sbt:compiler-interface:jar:sources:0.13.15:compile]
[DEBUG] (f) project = MavenProject: com.microsoft.pnp:spark-monitoring:1.0-SNAPSHOT @ C:\Users\KOF4BE\Downloads\spark-monitoring-master\src\pom.xml
[DEBUG] (f) reactorProjects = [MavenProject: com.microsoft.pnp:spark-monitoring:1.0-SNAPSHOT @ C:\Users\KOF4BE\Downloads\spark-monitoring-master\src\pom.xml, MavenProject: com.microsoft.pnp:spark-listeners:1.0-SNAPSHOT @ C:\Users\KOF4BE\Downloads\spark-monitoring-master\src\spark-listeners\pom.xml, MavenProject: com.microsoft.pnp:spark-jobs:1.0-SNAPSHOT @ C:\Users\KOF4BE\Downloads\spark-monitoring-master\src\spark-jobs\pom.xml, MavenProject: com.microsoft.pnp:spark-listeners-loganalytics:1.0-SNAPSHOT @ C:\Users\KOF4BE\Downloads\spark-monitoring-master\src\spark-listeners-loganalytics\pom.xml]
[DEBUG] (f) recompileMode = incremental
[DEBUG] (f) remoteRepos = [ id: bci-mvn
url: https://rb-artifactory.bosch.com/artifactory/bci-mvn-virtual
layout: default
snapshots: [enabled => true, update => daily]
releases: [enabled => true, update => daily]
, id: central
url: https://repo.maven.apache.org/maven2
layout: default
snapshots: [enabled => false, update => daily]
releases: [enabled => true, update => daily]
]
[DEBUG] (f) scalaClassName = scala.tools.nsc.Main
[DEBUG] (f) scalaCompatVersion = 2.11
[DEBUG] (f) scalaOrganization = org.scala-lang
[DEBUG] (f) scalaVersion = 2.11.8
[DEBUG] (f) sendJavaToScalac = true
[DEBUG] (f) session = org.apache.maven.execution.MavenSession@2e807c54
[DEBUG] (f) source = 1.8
[DEBUG] (f) target = 1.8
[DEBUG] (f) testAnalysisCacheFile = C:\Users\KOF4BE\Downloads\spark-monitoring-master\src\target\analysis\test-compile
[DEBUG] (f) testOutputDir = C:\Users\KOF4BE\Downloads\spark-monitoring-master\src\target\test-classes
[DEBUG] (f) testSourceDir = C:\Users\KOF4BE\Downloads\spark-monitoring-master\src\src\test\java\..\scala
[DEBUG] (f) useCanonicalPath = true
[DEBUG] (f) useZincServer = false
[DEBUG] (f) zincHost = 127.0.0.1
[DEBUG] (f) zincPort = 3030
[DEBUG] -- end configuration --
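
Note: the field dump above shows how scala-maven-plugin is effectively configured for this build (Scala 2.11.8, `-target:jvm-1.8`, javac source/target 1.8, incremental recompilation). A hedged sketch of a pom declaration consistent with those values, using documented scala-maven-plugin 3.4.2 parameters; the project's actual pom may structure this differently:

```xml
<!-- Sketch only: a scala-maven-plugin declaration consistent with the
     (f) values dumped above; not copied from the project's pom. -->
<plugin>
  <groupId>net.alchim31.maven</groupId>
  <artifactId>scala-maven-plugin</artifactId>
  <version>3.4.2</version>
  <configuration>
    <scalaVersion>2.11.8</scalaVersion>
    <scalaCompatVersion>2.11</scalaCompatVersion>
    <recompileMode>incremental</recompileMode>
    <args>
      <arg>-target:jvm-1.8</arg>
      <arg>-feature</arg>
    </args>
    <javacArgs>
      <javacArg>-source</javacArg>
      <javacArg>1.8</javacArg>
      <javacArg>-target</javacArg>
      <javacArg>1.8</javacArg>
    </javacArgs>
  </configuration>
  <executions>
    <execution>
      <goals>
        <goal>add-source</goal>
        <goal>compile</goal>
        <goal>testCompile</goal>
      </goals>
    </execution>
  </executions>
</plugin>
```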
[DEBUG] Checking for multiple versions of scala
[DEBUG] building maven31 dependency graph for com.microsoft.pnp:spark-monitoring:pom:1.0-SNAPSHOT with Maven31DependencyGraphBuilder
[DEBUG] Dependency collection stats: {ConflictMarker.analyzeTime=18363, ConflictMarker.markTime=17822, ConflictMarker.nodeCount=16, ConflictIdSorter.graphTime=13502, ConflictIdSorter.topsortTime=9181, ConflictIdSorter.conflictIdCount=9, ConflictIdSorter.conflictIdCycleCount=0, ConflictResolver.totalTime=200371, ConflictResolver.conflictItemCount=15, DefaultDependencyCollector.collectTime=511459, DefaultDependencyCollector.transformTime=266802}
[DEBUG] com.microsoft.pnp:spark-monitoring:pom:1.0-SNAPSHOT
[DEBUG] org.scala-lang:scala-library:jar:2.11.8:provided
[DEBUG] org.slf4j:slf4j-api:jar:1.7.7:provided
[DEBUG] junit:junit:jar:4.12:test
[DEBUG] org.hamcrest:hamcrest-core:jar:1.3:test
[DEBUG] org.scalatest:scalatest_2.11:jar:3.0.3:test
[DEBUG] org.scalactic:scalactic_2.11:jar:3.0.3:test
[DEBUG] org.scala-lang:scala-reflect:jar:2.11.8:test
[DEBUG] org.scala-lang.modules:scala-xml_2.11:jar:1.0.5:test
[DEBUG] org.scala-lang.modules:scala-parser-combinators_2.11:jar:1.0.4:test
[DEBUG] checking [com.microsoft.pnp:spark-monitoring:pom:1.0-SNAPSHOT] for scala version
[DEBUG] checking [org.scala-lang:scala-library:jar:2.11.8:provided] for scala version
[DEBUG] includes = [**/*.java,**/*.scala,]
[DEBUG] excludes = []
[INFO] No sources to compile
[INFO]
[INFO] --- scalatest-maven-plugin:2.0.0:test (test) @ spark-monitoring ---
[DEBUG] Dependency collection stats: {ConflictMarker.analyzeTime=71291, ConflictMarker.markTime=42127, ConflictMarker.nodeCount=79, ConflictIdSorter.graphTime=95055, ConflictIdSorter.topsortTime=23764, ConflictIdSorter.conflictIdCount=32, ConflictIdSorter.conflictIdCycleCount=0, ConflictResolver.totalTime=974311, ConflictResolver.conflictItemCount=77, DefaultDependencyCollector.collectTime=231596639, DefaultDependencyCollector.transformTime=1224911}
[DEBUG] org.scalatest:scalatest-maven-plugin:jar:2.0.0:
[DEBUG] org.apache.maven:maven-plugin-api:jar:3.3.9:compile
[DEBUG] org.apache.maven:maven-model:jar:3.3.9:compile
[DEBUG] org.apache.maven:maven-artifact:jar:3.3.9:compile
[DEBUG] org.eclipse.sisu:org.eclipse.sisu.plexus:jar:0.3.2:compile
[DEBUG] javax.enterprise:cdi-api:jar:1.0:compile
[DEBUG] javax.annotation:jsr250-api:jar:1.0:compile
[DEBUG] org.eclipse.sisu:org.eclipse.sisu.inject:jar:0.3.2:compile
[DEBUG] org.apache.maven:maven-core:jar:3.3.9:compile
[DEBUG] org.apache.maven:maven-settings:jar:3.3.9:compile
[DEBUG] org.apache.maven:maven-settings-builder:jar:3.3.9:compile
[DEBUG] org.apache.maven:maven-builder-support:jar:3.3.9:compile
[DEBUG] org.apache.maven:maven-repository-metadata:jar:3.3.9:compile
[DEBUG] org.apache.maven:maven-model-builder:jar:3.3.9:compile
[DEBUG] com.google.guava:guava:jar:18.0:compile
[DEBUG] org.apache.maven:maven-aether-provider:jar:3.3.9:compile
[DEBUG] org.eclipse.aether:aether-spi:jar:1.0.2.v20150114:compile
[DEBUG] org.eclipse.aether:aether-impl:jar:1.0.2.v20150114:compile
[DEBUG] org.eclipse.aether:aether-api:jar:1.0.2.v20150114:compile
[DEBUG] org.eclipse.aether:aether-util:jar:1.0.2.v20150114:compile
[DEBUG] com.google.inject:guice:jar:no_aop:4.0:compile
[DEBUG] javax.inject:javax.inject:jar:1:compile
[DEBUG] aopalliance:aopalliance:jar:1.0:compile
[DEBUG] org.codehaus.plexus:plexus-interpolation:jar:1.21:compile
[DEBUG] org.codehaus.plexus:plexus-utils:jar:3.0.22:compile
[DEBUG] org.codehaus.plexus:plexus-classworlds:jar:2.5.2:compile
[DEBUG] org.codehaus.plexus:plexus-component-annotations:jar:1.6:compile
[DEBUG] org.sonatype.plexus:plexus-sec-dispatcher:jar:1.3:compile
[DEBUG] org.sonatype.plexus:plexus-cipher:jar:1.4:compile
[DEBUG] org.apache.commons:commons-lang3:jar:3.4:compile
[DEBUG] org.apache.maven.reporting:maven-reporting-api:jar:3.0:compile
[DEBUG] org.apache.maven.doxia:doxia-sink-api:jar:1.0:compile
[DEBUG] Created new class realm plugin>org.scalatest:scalatest-maven-plugin:2.0.0
[DEBUG] Importing foreign packages into class realm plugin>org.scalatest:scalatest-maven-plugin:2.0.0
[DEBUG] Imported: < maven.api
[DEBUG] Populating class realm plugin>org.scalatest:scalatest-maven-plugin:2.0.0
[DEBUG] Included: org.scalatest:scalatest-maven-plugin:jar:2.0.0
[DEBUG] Included: javax.enterprise:cdi-api:jar:1.0
[DEBUG] Included: org.eclipse.sisu:org.eclipse.sisu.inject:jar:0.3.2
[DEBUG] Included: org.apache.maven:maven-builder-support:jar:3.3.9
[DEBUG] Included: com.google.guava:guava:jar:18.0
[DEBUG] Included: org.eclipse.aether:aether-util:jar:1.0.2.v20150114
[DEBUG] Included: com.google.inject:guice:jar:no_aop:4.0
[DEBUG] Included: aopalliance:aopalliance:jar:1.0
[DEBUG] Included: org.codehaus.plexus:plexus-interpolation:jar:1.21
[DEBUG] Included: org.codehaus.plexus:plexus-utils:jar:3.0.22
[DEBUG] Included: org.codehaus.plexus:plexus-component-annotations:jar:1.6
[DEBUG] Included: org.sonatype.plexus:plexus-sec-dispatcher:jar:1.3
[DEBUG] Included: org.sonatype.plexus:plexus-cipher:jar:1.4
[DEBUG] Included: org.apache.commons:commons-lang3:jar:3.4
[DEBUG] Included: org.apache.maven.reporting:maven-reporting-api:jar:3.0
[DEBUG] Included: org.apache.maven.doxia:doxia-sink-api:jar:1.0
[DEBUG] Configuring mojo org.scalatest:scalatest-maven-plugin:2.0.0:test from plugin realm ClassRealm[plugin>org.scalatest:scalatest-maven-plugin:2.0.0, parent: jdk.internal.loader.ClassLoaders$AppClassLoader@c387f44]
[DEBUG] Configuring mojo 'org.scalatest:scalatest-maven-plugin:2.0.0:test' with basic configurator -->
[DEBUG] (f) debugForkedProcess = false
[DEBUG] (f) debuggerPort = 5005
[DEBUG] (f) filereports = TestSuiteReport.txt
[DEBUG] (f) forkMode = never
[DEBUG] (f) forkedProcessTimeoutInSeconds = 0
[DEBUG] (f) junitxml = .
[DEBUG] (f) logForkedProcessCommand = false
[DEBUG] (f) outputDirectory = C:\Users\KOF4BE\Downloads\spark-monitoring-master\src\target\classes
[DEBUG] (f) parallel = false
[DEBUG] (f) project = MavenProject: com.microsoft.pnp:spark-monitoring:1.0-SNAPSHOT @ C:\Users\KOF4BE\Downloads\spark-monitoring-master\src\pom.xml
[DEBUG] (f) reportsDirectory = C:\Users\KOF4BE\Downloads\spark-monitoring-master\src\target\surefire-reports
[DEBUG] (f) systemProperties = {log4j.configuration=file:src/test/resources/log4j.properties}
[DEBUG] (f) testOutputDirectory = C:\Users\KOF4BE\Downloads\spark-monitoring-master\src\target\test-classes
[DEBUG] -- end configuration --
[DEBUG] [-R, C:\Users\KOF4BE\Downloads\spark-monitoring-master\src\target\classes C:\Users\KOF4BE\Downloads\spark-monitoring-master\src\target\test-classes, -o, -f, C:\Users\KOF4BE\Downloads\spark-monitoring-master\src\target\surefire-reports\TestSuiteReport.txt, -u, C:\Users\KOF4BE\Downloads\spark-monitoring-master\src\target\surefire-reports\.]
Discovery starting.
Discovery completed in 193 milliseconds.
Run starting. Expected test count is: 0
DiscoverySuite:
Run completed in 212 milliseconds.
Total number of tests run: 0
Suites: completed 1, aborted 0
Tests: succeeded 0, failed 0, canceled 0, ignored 0, pending 0
No tests were executed.
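
Note: `No tests were executed.` is likewise expected for the root aggregator: ScalaTest's `DiscoverySuite` scans `target/test-classes`, which is empty for this module, so the expected test count is 0. The `(f)` dump above maps to scalatest-maven-plugin settings roughly as in this sketch, which uses that plugin's documented parameters rather than the project's actual pom:

```xml
<!-- Sketch: scalatest-maven-plugin settings consistent with the (f) dump
     above (report file, JUnit XML output, no forking, log4j property). -->
<plugin>
  <groupId>org.scalatest</groupId>
  <artifactId>scalatest-maven-plugin</artifactId>
  <version>2.0.0</version>
  <configuration>
    <reportsDirectory>${project.build.directory}/surefire-reports</reportsDirectory>
    <filereports>TestSuiteReport.txt</filereports>
    <junitxml>.</junitxml>
    <forkMode>never</forkMode>
    <systemProperties>
      <log4j.configuration>file:src/test/resources/log4j.properties</log4j.configuration>
    </systemProperties>
  </configuration>
  <executions>
    <execution>
      <id>test</id>
      <goals>
        <goal>test</goal>
      </goals>
    </execution>
  </executions>
</plugin>
```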
[INFO]
[INFO] --- maven-install-plugin:2.4:install (default-install) @ spark-monitoring ---
[DEBUG] Dependency collection stats: {ConflictMarker.analyzeTime=41047, ConflictMarker.markTime=29164, ConflictMarker.nodeCount=38, ConflictIdSorter.graphTime=73991, ConflictIdSorter.topsortTime=12962, ConflictIdSorter.conflictIdCount=15, ConflictIdSorter.conflictIdCycleCount=0, ConflictResolver.totalTime=345654, ConflictResolver.conflictItemCount=35, DefaultDependencyCollector.collectTime=104036911, DefaultDependencyCollector.transformTime=520101}
[DEBUG] org.apache.maven.plugins:maven-install-plugin:jar:2.4:
[DEBUG] org.apache.maven:maven-plugin-api:jar:2.0.6:compile
[DEBUG] org.apache.maven:maven-project:jar:2.0.6:compile
[DEBUG] org.apache.maven:maven-settings:jar:2.0.6:compile
[DEBUG] org.apache.maven:maven-profile:jar:2.0.6:compile
[DEBUG] org.apache.maven:maven-plugin-registry:jar:2.0.6:compile
[DEBUG] org.codehaus.plexus:plexus-container-default:jar:1.0-alpha-9-stable-1:compile
[DEBUG] junit:junit:jar:3.8.1:compile
[DEBUG] classworlds:classworlds:jar:1.1-alpha-2:compile
[DEBUG] org.apache.maven:maven-model:jar:2.0.6:compile
[DEBUG] org.apache.maven:maven-artifact-manager:jar:2.0.6:compile
[DEBUG] org.apache.maven:maven-repository-metadata:jar:2.0.6:compile
[DEBUG] org.apache.maven:maven-artifact:jar:2.0.6:compile
[DEBUG] org.codehaus.plexus:plexus-utils:jar:3.0.5:compile
[DEBUG] org.codehaus.plexus:plexus-digest:jar:1.0:compile
[DEBUG] Created new class realm plugin>org.apache.maven.plugins:maven-install-plugin:2.4
[DEBUG] Importing foreign packages into class realm plugin>org.apache.maven.plugins:maven-install-plugin:2.4
[DEBUG] Imported: < maven.api
[DEBUG] Populating class realm plugin>org.apache.maven.plugins:maven-install-plugin:2.4
[DEBUG] Included: org.apache.maven.plugins:maven-install-plugin:jar:2.4
[DEBUG] Included: junit:junit:jar:3.8.1
[DEBUG] Included: org.codehaus.plexus:plexus-utils:jar:3.0.5
[DEBUG] Included: org.codehaus.plexus:plexus-digest:jar:1.0
[DEBUG] Configuring mojo org.apache.maven.plugins:maven-install-plugin:2.4:install from plugin realm ClassRealm[plugin>org.apache.maven.plugins:maven-install-plugin:2.4, parent: jdk.internal.loader.ClassLoaders$AppClassLoader@c387f44]
[DEBUG] Configuring mojo 'org.apache.maven.plugins:maven-install-plugin:2.4:install' with basic configurator -->
[DEBUG] (f) artifact = com.microsoft.pnp:spark-monitoring:pom:1.0-SNAPSHOT
[DEBUG] (f) attachedArtifacts = []
[DEBUG] (f) createChecksum = false
[DEBUG] (f) localRepository = id: local
url: file:///C:/Users/KOF4BE/.m2/repository/
layout: default
snapshots: [enabled => true, update => always]
releases: [enabled => true, update => always]

[DEBUG] (f) packaging = pom
[DEBUG] (f) pomFile = C:\Users\KOF4BE\Downloads\spark-monitoring-master\src\pom.xml
[DEBUG] (s) skip = false
[DEBUG] (f) updateReleaseInfo = false
[DEBUG] -- end configuration --
[INFO] Installing C:\Users\KOF4BE\Downloads\spark-monitoring-master\src\pom.xml to C:\Users\KOF4BE\.m2\repository\com\microsoft\pnp\spark-monitoring\1.0-SNAPSHOT\spark-monitoring-1.0-SNAPSHOT.pom
[DEBUG] Writing tracking file C:\Users\KOF4BE\.m2\repository\com\microsoft\pnp\spark-monitoring\1.0-SNAPSHOT\_remote.repositories
[DEBUG] Installing com.microsoft.pnp:spark-monitoring:1.0-SNAPSHOT/maven-metadata.xml to C:\Users\KOF4BE\.m2\repository\com\microsoft\pnp\spark-monitoring\1.0-SNAPSHOT\maven-metadata-local.xml
[DEBUG] Installing com.microsoft.pnp:spark-monitoring/maven-metadata.xml to C:\Users\KOF4BE\.m2\repository\com\microsoft\pnp\spark-monitoring\maven-metadata-local.xml
[INFO]
[INFO] -----------------< com.microsoft.pnp:spark-listeners >------------------
[INFO] Building spark-listeners 1.0-SNAPSHOT [2/4]
[INFO] --------------------------------[ jar ]---------------------------------
[DEBUG] Lifecycle default -> [validate, initialize, generate-sources, process-sources, generate-resources, process-resources, compile, process-classes, generate-test-sources, process-test-sources, generate-test-resources, process-test-resources, test-compile, process-test-classes, test, prepare-package, package, pre-integration-test, integration-test, post-integration-test, verify, install, deploy]
[DEBUG] Lifecycle clean -> [pre-clean, clean, post-clean]
[DEBUG] Lifecycle site -> [pre-site, site, post-site, site-deploy]
[DEBUG] Lifecycle default -> [validate, initialize, generate-sources, process-sources, generate-resources, process-resources, compile, process-classes, generate-test-sources, process-test-sources, generate-test-resources, process-test-resources, test-compile, process-test-classes, test, prepare-package, package, pre-integration-test, integration-test, post-integration-test, verify, install, deploy]
[DEBUG] Lifecycle clean -> [pre-clean, clean, post-clean]
[DEBUG] Lifecycle site -> [pre-site, site, post-site, site-deploy]
[DEBUG] Lifecycle default -> [validate, initialize, generate-sources, process-sources, generate-resources, process-resources, compile, process-classes, generate-test-sources, process-test-sources, generate-test-resources, process-test-resources, test-compile, process-test-classes, test, prepare-package, package, pre-integration-test, integration-test, post-integration-test, verify, install, deploy]
[DEBUG] Lifecycle clean -> [pre-clean, clean, post-clean]
[DEBUG] Lifecycle site -> [pre-site, site, post-site, site-deploy]
[DEBUG] Lifecycle default -> [validate, initialize, generate-sources, process-sources, generate-resources, process-resources, compile, process-classes, generate-test-sources, process-test-sources, generate-test-resources, process-test-resources, test-compile, process-test-classes, test, prepare-package, package, pre-integration-test, integration-test, post-integration-test, verify, install, deploy]
[DEBUG] Lifecycle clean -> [pre-clean, clean, post-clean]
[DEBUG] Lifecycle site -> [pre-site, site, post-site, site-deploy]
[DEBUG] Lifecycle default -> [validate, initialize, generate-sources, process-sources, generate-resources, process-resources, compile, process-classes, generate-test-sources, process-test-sources, generate-test-resources, process-test-resources, test-compile, process-test-classes, test, prepare-package, package, pre-integration-test, integration-test, post-integration-test, verify, install, deploy]
[DEBUG] Lifecycle clean -> [pre-clean, clean, post-clean]
[DEBUG] Lifecycle site -> [pre-site, site, post-site, site-deploy]
[DEBUG] Lifecycle default -> [validate, initialize, generate-sources, process-sources, generate-resources, process-resources, compile, process-classes, generate-test-sources, process-test-sources, generate-test-resources, process-test-resources, test-compile, process-test-classes, test, prepare-package, package, pre-integration-test, integration-test, post-integration-test, verify, install, deploy]
[DEBUG] Lifecycle clean -> [pre-clean, clean, post-clean]
[DEBUG] Lifecycle site -> [pre-site, site, post-site, site-deploy]
[DEBUG] Lifecycle default -> [validate, initialize, generate-sources, process-sources, generate-resources, process-resources, compile, process-classes, generate-test-sources, process-test-sources, generate-test-resources, process-test-resources, test-compile, process-test-classes, test, prepare-package, package, pre-integration-test, integration-test, post-integration-test, verify, install, deploy]
[DEBUG] Lifecycle clean -> [pre-clean, clean, post-clean]
[DEBUG] Lifecycle site -> [pre-site, site, post-site, site-deploy]
[DEBUG] Lifecycle default -> [validate, initialize, generate-sources, process-sources, generate-resources, process-resources, compile, process-classes, generate-test-sources, process-test-sources, generate-test-resources, process-test-resources, test-compile, process-test-classes, test, prepare-package, package, pre-integration-test, integration-test, post-integration-test, verify, install, deploy]
[DEBUG] Lifecycle clean -> [pre-clean, clean, post-clean]
[DEBUG] Lifecycle site -> [pre-site, site, post-site, site-deploy]
[DEBUG] Lifecycle default -> [validate, initialize, generate-sources, process-sources, generate-resources, process-resources, compile, process-classes, generate-test-sources, process-test-sources, generate-test-resources, process-test-resources, test-compile, process-test-classes, test, prepare-package, package, pre-integration-test, integration-test, post-integration-test, verify, install, deploy]
[DEBUG] Lifecycle clean -> [pre-clean, clean, post-clean]
[DEBUG] Lifecycle site -> [pre-site, site, post-site, site-deploy]
[DEBUG] Lifecycle default -> [validate, initialize, generate-sources, process-sources, generate-resources, process-resources, compile, process-classes, generate-test-sources, process-test-sources, generate-test-resources, process-test-resources, test-compile, process-test-classes, test, prepare-package, package, pre-integration-test, integration-test, post-integration-test, verify, install, deploy]
[DEBUG] Lifecycle clean -> [pre-clean, clean, post-clean]
[DEBUG] Lifecycle site -> [pre-site, site, post-site, site-deploy]
[DEBUG] Lifecycle default -> [validate, initialize, generate-sources, process-sources, generate-resources, process-resources, compile, process-classes, generate-test-sources, process-test-sources, generate-test-resources, process-test-resources, test-compile, process-test-classes, test, prepare-package, package, pre-integration-test, integration-test, post-integration-test, verify, install, deploy]
[DEBUG] Lifecycle clean -> [pre-clean, clean, post-clean]
[DEBUG] Lifecycle site -> [pre-site, site, post-site, site-deploy]
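
Note: the repeated `Lifecycle default/clean/site` listings above are normal `-X` output; Maven prints the phase lists once for each plugin it resolves while computing the build plan for `spark-listeners`. In the plan that follows, each goal is attached to one of those phases through an execution binding along the lines of this sketch (illustrative only, not the project's pom):

```xml
<!-- Sketch of a phase binding; the PROJECT BUILD PLAN below results from
     resolving bindings like this for every plugin in the module. -->
<execution>
  <id>test</id>
  <phase>test</phase>
  <goals>
    <goal>test</goal>
  </goals>
</execution>
```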
[DEBUG] === PROJECT BUILD PLAN ================================================
[DEBUG] Project: com.microsoft.pnp:spark-listeners:1.0-SNAPSHOT
[DEBUG] Dependencies (collect): []
[DEBUG] Dependencies (resolve): [compile, runtime, test]
[DEBUG] Repositories (dependencies): [bci-mvn (https://rb-artifactory.bosch.com/artifactory/bci-mvn-virtual, default, releases+snapshots), central (https://repo.maven.apache.org/maven2, default, releases)]
[DEBUG] Repositories (plugins) : [bci-mvn (https://rb-artifactory.bosch.com/artifactory/bci-mvn-virtual, default, releases+snapshots), central (https://repo.maven.apache.org/maven2, default, releases)]
[DEBUG] -----------------------------------------------------------------------
[DEBUG] Goal: org.apache.maven.plugins:maven-clean-plugin:3.1.0:clean (default-clean)
[DEBUG] Style: Regular
[DEBUG] Configuration:


${maven.clean.excludeDefaultDirectories}
${maven.clean.failOnError}
${maven.clean.followSymLinks}


${maven.clean.retryOnError}
${maven.clean.skip}

${maven.clean.verbose}

[DEBUG] -----------------------------------------------------------------------
[DEBUG] Goal: net.alchim31.maven:scala-maven-plugin:3.4.2:add-source (default)
[DEBUG] Style: Regular
[DEBUG] Configuration:

${project}


${maven.scala.useCanonicalPath}

[DEBUG] -----------------------------------------------------------------------
[DEBUG] Goal: org.apache.maven.plugins:maven-clean-plugin:3.1.0:clean (auto-clean)
[DEBUG] Style: Regular
[DEBUG] Configuration:


${maven.clean.excludeDefaultDirectories}
${maven.clean.failOnError}
${maven.clean.followSymLinks}


${maven.clean.retryOnError}
${maven.clean.skip}

${maven.clean.verbose}

[DEBUG] -----------------------------------------------------------------------
[DEBUG] Goal: org.apache.maven.plugins:maven-resources-plugin:2.6:resources (default-resources)
[DEBUG] Style: Regular
[DEBUG] Configuration:


${encoding}
${maven.resources.escapeString}
${maven.resources.escapeWindowsPaths}
${maven.resources.includeEmptyDirs}

${maven.resources.overwrite}



${maven.resources.supportMultiLineFiltering}



[DEBUG] -----------------------------------------------------------------------
[DEBUG] Goal: net.alchim31.maven:scala-maven-plugin:3.4.2:compile (default)
[DEBUG] Style: Regular
[DEBUG] Configuration:

${addJavacArgs}
${addScalacArgs}
${addZincArgs}
${analysisCacheFile}

-target:jvm-1.8
-feature
-dependencyfile
C:\Users\KOF4BE\Downloads\spark-monitoring-master\src\spark-listeners\target/.scala_dependencies

${maven.scala.checkConsistency}
${compileOrder}
${displayCmd}
${project.build.sourceEncoding}




-source
1.8
-target
1.8
${javacArgs}
${javacGenerateDebugSymbols}
${localRepository}
${localRepository}
${notifyCompilation}
${project.build.outputDirectory}

${project}

incremental
${project.remoteArtifactRepositories}
${maven.scala.className}
2.11
${scala.home}
${scala.organization}
2.11.8

${session}

${maven.compiler.source}
${maven.compiler.target}
${maven.scala.useCanonicalPath}
${useZincServer}
${zincHost}
${zincPort}
[DEBUG] -----------------------------------------------------------------------
[DEBUG] Goal: org.apache.maven.plugins:maven-resources-plugin:2.6:testResources (default-testResources)
[DEBUG] Style: Regular
[DEBUG] Configuration:
${encoding}
${maven.resources.escapeString}
${maven.resources.escapeWindowsPaths}
${maven.resources.includeEmptyDirs}
${maven.resources.overwrite}
${maven.test.skip}
${maven.resources.supportMultiLineFiltering}
[DEBUG] -----------------------------------------------------------------------
[DEBUG] Goal: org.apache.maven.plugins:maven-compiler-plugin:3.8.0:testCompile (default-testCompile)
[DEBUG] Style: Regular
[DEBUG] Configuration:
-Xlint
${maven.compiler.compilerId}
${maven.compiler.compilerReuseStrategy}
${maven.compiler.compilerVersion}
${maven.compiler.debug}
${maven.compiler.debuglevel}
${encoding}
${maven.compiler.executable}
${maven.compiler.failOnError}
${maven.compiler.failOnWarning}
${maven.compiler.forceJavacCompilerUse}
${maven.compiler.fork}
${maven.compiler.maxmem}
${maven.compiler.meminitial}
${maven.compiler.optimize}
${maven.compiler.parameters}
${maven.compiler.release}
${maven.compiler.showDeprecation}
${maven.compiler.showWarnings}
${maven.test.skip}
${maven.compiler.skipMultiThreadWarning}
1.8
${lastModGranularityMs}
1.8
${maven.compiler.testRelease}
${maven.compiler.testSource}
${maven.compiler.testTarget}
${maven.compiler.useIncrementalCompilation}
${maven.compiler.verbose}
[DEBUG] -----------------------------------------------------------------------
[DEBUG] Goal: net.alchim31.maven:scala-maven-plugin:3.4.2:testCompile (default)
[DEBUG] Style: Regular
[DEBUG] Configuration:
${addJavacArgs}
${addScalacArgs}
${addZincArgs}
-target:jvm-1.8
-feature
-dependencyfile
C:\Users\KOF4BE\Downloads\spark-monitoring-master\src\spark-listeners\target/.scala_dependencies
${maven.scala.checkConsistency}
${compileOrder}
${displayCmd}
${project.build.sourceEncoding}
-source
1.8
-target
1.8
${javacArgs}
${javacGenerateDebugSymbols}
${localRepository}
${localRepository}
${notifyCompilation}
${project}
incremental
${project.remoteArtifactRepositories}
${maven.scala.className}
2.11
${scala.home}
${scala.organization}
2.11.8
${session}
${maven.test.skip}
${maven.compiler.source}
${maven.compiler.target}
${testAnalysisCacheFile}
${maven.scala.useCanonicalPath}
${useZincServer}
${zincHost}
${zincPort}
[DEBUG] -----------------------------------------------------------------------
[DEBUG] Goal: org.apache.maven.plugins:maven-surefire-plugin:2.22.0:test (default-test)
[DEBUG] Style: Regular
[DEBUG] Configuration:
${maven.test.additionalClasspath}
${argLine}
${childDelegation}
${maven.test.dependency.excludes}
${maven.surefire.debug}
${dependenciesToScan}
${disableXmlReport}
${enableAssertions}
${surefire.encoding}
${excludedGroups}
${surefire.excludesFile}
${surefire.failIfNoSpecifiedTests}
${failIfNoTests}
${forkCount}
${forkMode}
${surefire.exitTimeout}
${surefire.timeout}
${groups}
${surefire.includesFile}
${junitArtifactName}
${junitPlatformArtifactName}
${jvm}
${objectFactory}
${parallel}
${parallelOptimized}
${surefire.parallel.forcedTimeout}
${surefire.parallel.timeout}
${perCoreThreadCount}
${plugin.artifactMap}
${surefire.printSummary}
${project.artifactMap}
${maven.test.redirectTestOutputToFile}
${surefire.reportFormat}
${surefire.reportNameSuffix}
${surefire.rerunFailingTestsCount}
${reuseForks}
${surefire.runOrder}
${surefire.shutdown}
${maven.test.skip}
${surefire.skipAfterFailureCount}
${maven.test.skip.exec}
${skipTests}
${surefire.suiteXmlFiles}
${tempDir}
${test}
${maven.test.failure.ignore}
${testNGArtifactName}
${threadCount}
${threadCountClasses}
${threadCountMethods}
${threadCountSuites}
${trimStackTrace}
${surefire.useFile}
${surefire.useManifestOnlyJar}
${surefire.useSystemClassLoader}
${useUnlimitedThreads}
${basedir}
[DEBUG] -----------------------------------------------------------------------
[DEBUG] Goal: org.scalatest:scalatest-maven-plugin:2.0.0:test (test)
[DEBUG] Style: Regular
[DEBUG] Configuration:
${argLine}
${config}
${debugArgLine}
${debugForkedProcess}
${debuggerPort}
TestSuiteReport.txt
never
${timeout}
${htmlreporters}
${junitClasses}
.
${logForkedProcessCommand}
${membersOnlySuites}
${memoryFiles}
${project.build.outputDirectory}
false
${reporters}
C:\Users\KOF4BE\Downloads\spark-monitoring-master\src\spark-listeners\target/surefire-reports
${runpath}
false
${spanScaleFactor}
${stderr}
${stdout}
${suffixes}
${suites}
file:src/test/resources/log4j.properties
${tagsToExclude}
${tagsToInclude}
${maven.test.failure.ignore}
${testNGXMLFiles}
${project.build.testOutputDirectory}
${tests}
${testsFiles}
${wildcardSuites}
[DEBUG] -----------------------------------------------------------------------
[DEBUG] Goal: org.apache.maven.plugins:maven-jar-plugin:2.4:jar (default-jar)
[DEBUG] Style: Regular
[DEBUG] Configuration:
${jar.finalName}
${jar.forceCreation}
${jar.skipIfEmpty}
${jar.useDefaultManifestFile}
[DEBUG] -----------------------------------------------------------------------
[DEBUG] Goal: org.apache.maven.plugins:maven-install-plugin:2.4:install (default-install)
[DEBUG] Style: Regular
[DEBUG] Configuration:
${createChecksum}
${localRepository}
${maven.install.skip}
${updateReleaseInfo}
[DEBUG] =======================================================================
[DEBUG] Dependency collection stats: {ConflictMarker.analyzeTime=1764454, ConflictMarker.markTime=991594, ConflictMarker.nodeCount=2338, ConflictIdSorter.graphTime=589771, ConflictIdSorter.topsortTime=136642, ConflictIdSorter.conflictIdCount=169, ConflictIdSorter.conflictIdCycleCount=0, ConflictResolver.totalTime=9630777, ConflictResolver.conflictItemCount=467, DefaultDependencyCollector.collectTime=1959508826, DefaultDependencyCollector.transformTime=13128901}
[DEBUG] com.microsoft.pnp:spark-listeners:jar:1.0-SNAPSHOT
[DEBUG] org.scala-lang:scala-library:jar:2.11.8:provided
[DEBUG] org.mockito:mockito-core:jar:1.10.19:test
[DEBUG] org.hamcrest:hamcrest-core:jar:1.1:test
[DEBUG] org.objenesis:objenesis:jar:2.1:provided
[DEBUG] org.apache.spark:spark-core_2.11:jar:2.4.0:provided
[DEBUG] org.apache.avro:avro:jar:1.8.2:provided
[DEBUG] org.codehaus.jackson:jackson-core-asl:jar:1.9.13:provided
[DEBUG] org.codehaus.jackson:jackson-mapper-asl:jar:1.9.13:provided
[DEBUG] com.thoughtworks.paranamer:paranamer:jar:2.7:provided
[DEBUG] org.apache.commons:commons-compress:jar:1.8.1:provided
[DEBUG] org.tukaani:xz:jar:1.5:provided
[DEBUG] org.apache.avro:avro-mapred:jar:hadoop2:1.8.2:provided
[DEBUG] org.apache.avro:avro-ipc:jar:1.8.2:provided
[DEBUG] commons-codec:commons-codec:jar:1.10:provided (scope managed from compile) (version managed from 1.9)
[DEBUG] com.twitter:chill_2.11:jar:0.9.3:provided
[DEBUG] com.esotericsoftware:kryo-shaded:jar:4.0.2:provided
[DEBUG] com.esotericsoftware:minlog:jar:1.3.0:provided
[DEBUG] com.twitter:chill-java:jar:0.9.3:provided
[DEBUG] org.apache.xbean:xbean-asm6-shaded:jar:4.8:provided
[DEBUG] org.apache.hadoop:hadoop-client:jar:2.6.5:provided
[DEBUG] org.apache.hadoop:hadoop-common:jar:2.6.5:provided
[DEBUG] commons-cli:commons-cli:jar:1.2:provided
[DEBUG] xmlenc:xmlenc:jar:0.52:provided
[DEBUG] commons-httpclient:commons-httpclient:jar:3.1:provided
[DEBUG] commons-io:commons-io:jar:2.4:provided
[DEBUG] commons-collections:commons-collections:jar:3.2.2:provided
[DEBUG] commons-configuration:commons-configuration:jar:1.6:provided
[DEBUG] commons-digester:commons-digester:jar:1.8:provided
[DEBUG] commons-beanutils:commons-beanutils:jar:1.7.0:provided
[DEBUG] commons-beanutils:commons-beanutils-core:jar:1.8.0:provided
[DEBUG] com.google.code.gson:gson:jar:2.2.4:provided
[DEBUG] org.apache.hadoop:hadoop-auth:jar:2.6.5:provided
[DEBUG] org.apache.httpcomponents:httpclient:jar:4.5.4:provided (scope managed from compile) (version managed from 4.2.5)
[DEBUG] org.apache.httpcomponents:httpcore:jar:4.4.7:provided
[DEBUG] org.apache.directory.server:apacheds-kerberos-codec:jar:2.0.0-M15:provided
[DEBUG] org.apache.directory.server:apacheds-i18n:jar:2.0.0-M15:provided
[DEBUG] org.apache.directory.api:api-asn1-api:jar:1.0.0-M20:provided
[DEBUG] org.apache.directory.api:api-util:jar:1.0.0-M20:provided
[DEBUG] org.apache.curator:curator-client:jar:2.6.0:provided
[DEBUG] org.htrace:htrace-core:jar:3.0.4:provided
[DEBUG] org.apache.hadoop:hadoop-hdfs:jar:2.6.5:provided
[DEBUG] org.mortbay.jetty:jetty-util:jar:6.1.26:provided
[DEBUG] xerces:xercesImpl:jar:2.9.1:provided
[DEBUG] xml-apis:xml-apis:jar:1.3.04:provided
[DEBUG] org.apache.hadoop:hadoop-mapreduce-client-app:jar:2.6.5:provided
[DEBUG] org.apache.hadoop:hadoop-mapreduce-client-common:jar:2.6.5:provided
[DEBUG] org.apache.hadoop:hadoop-yarn-client:jar:2.6.5:provided
[DEBUG] org.apache.hadoop:hadoop-yarn-server-common:jar:2.6.5:provided
[DEBUG] org.apache.hadoop:hadoop-mapreduce-client-shuffle:jar:2.6.5:provided
[DEBUG] org.apache.hadoop:hadoop-yarn-api:jar:2.6.5:provided
[DEBUG] org.apache.hadoop:hadoop-mapreduce-client-core:jar:2.6.5:provided
[DEBUG] org.apache.hadoop:hadoop-yarn-common:jar:2.6.5:provided
[DEBUG] javax.xml.bind:jaxb-api:jar:2.2.2:provided
[DEBUG] javax.xml.stream:stax-api:jar:1.0-2:provided
[DEBUG] org.codehaus.jackson:jackson-jaxrs:jar:1.9.13:provided
[DEBUG] org.codehaus.jackson:jackson-xc:jar:1.9.13:provided
[DEBUG] org.apache.hadoop:hadoop-mapreduce-client-jobclient:jar:2.6.5:provided
[DEBUG] org.apache.hadoop:hadoop-annotations:jar:2.6.5:provided
[DEBUG] org.apache.spark:spark-launcher_2.11:jar:2.4.0:provided
[DEBUG] org.apache.spark:spark-kvstore_2.11:jar:2.4.0:provided
[DEBUG] org.fusesource.leveldbjni:leveldbjni-all:jar:1.8:provided
[DEBUG] com.fasterxml.jackson.core:jackson-core:jar:2.6.7:provided
[DEBUG] com.fasterxml.jackson.core:jackson-annotations:jar:2.6.7:provided
[DEBUG] org.apache.spark:spark-network-common_2.11:jar:2.4.0:provided
[DEBUG] org.apache.spark:spark-network-shuffle_2.11:jar:2.4.0:provided
[DEBUG] org.apache.spark:spark-unsafe_2.11:jar:2.4.0:provided
[DEBUG] javax.activation:activation:jar:1.1.1:provided
[DEBUG] org.apache.curator:curator-recipes:jar:2.6.0:provided
[DEBUG] org.apache.curator:curator-framework:jar:2.6.0:provided
[DEBUG] com.google.guava:guava:jar:16.0.1:provided
[DEBUG] org.apache.zookeeper:zookeeper:jar:3.4.6:provided
[DEBUG] javax.servlet:javax.servlet-api:jar:3.1.0:provided
[DEBUG] org.apache.commons:commons-lang3:jar:3.5:provided
[DEBUG] org.apache.commons:commons-math3:jar:3.4.1:provided
[DEBUG] com.google.code.findbugs:jsr305:jar:1.3.9:provided
[DEBUG] org.slf4j:jul-to-slf4j:jar:1.7.16:provided
[DEBUG] org.slf4j:jcl-over-slf4j:jar:1.7.16:provided
[DEBUG] org.slf4j:slf4j-log4j12:jar:1.7.16:provided
[DEBUG] com.ning:compress-lzf:jar:1.0.3:provided
[DEBUG] org.xerial.snappy:snappy-java:jar:1.1.7.1:provided
[DEBUG] org.lz4:lz4-java:jar:1.4.0:provided
[DEBUG] com.github.luben:zstd-jni:jar:1.3.2-2:provided
[DEBUG] org.roaringbitmap:RoaringBitmap:jar:0.5.11:provided
[DEBUG] commons-net:commons-net:jar:3.1:provided
[DEBUG] org.json4s:json4s-jackson_2.11:jar:3.5.3:provided
[DEBUG] org.json4s:json4s-core_2.11:jar:3.5.3:provided
[DEBUG] org.json4s:json4s-ast_2.11:jar:3.5.3:provided
[DEBUG] org.json4s:json4s-scalap_2.11:jar:3.5.3:provided
[DEBUG] org.glassfish.jersey.core:jersey-client:jar:2.22.2:provided
[DEBUG] javax.ws.rs:javax.ws.rs-api:jar:2.0.1:provided
[DEBUG] org.glassfish.hk2:hk2-api:jar:2.4.0-b34:provided
[DEBUG] org.glassfish.hk2:hk2-utils:jar:2.4.0-b34:provided
[DEBUG] org.glassfish.hk2.external:aopalliance-repackaged:jar:2.4.0-b34:provided
[DEBUG] org.glassfish.hk2.external:javax.inject:jar:2.4.0-b34:provided
[DEBUG] org.glassfish.hk2:hk2-locator:jar:2.4.0-b34:provided
[DEBUG] org.javassist:javassist:jar:3.18.1-GA:provided
[DEBUG] org.glassfish.jersey.core:jersey-common:jar:2.22.2:provided
[DEBUG] javax.annotation:javax.annotation-api:jar:1.2:provided
[DEBUG] org.glassfish.jersey.bundles.repackaged:jersey-guava:jar:2.22.2:provided
[DEBUG] org.glassfish.hk2:osgi-resource-locator:jar:1.0.1:provided
[DEBUG] org.glassfish.jersey.core:jersey-server:jar:2.22.2:provided
[DEBUG] org.glassfish.jersey.media:jersey-media-jaxb:jar:2.22.2:provided
[DEBUG] javax.validation:validation-api:jar:1.1.0.Final:provided
[DEBUG] org.glassfish.jersey.containers:jersey-container-servlet:jar:2.22.2:provided
[DEBUG] org.glassfish.jersey.containers:jersey-container-servlet-core:jar:2.22.2:provided
[DEBUG] io.netty:netty-all:jar:4.1.17.Final:provided
[DEBUG] io.netty:netty:jar:3.9.9.Final:provided
[DEBUG] com.clearspring.analytics:stream:jar:2.7.0:provided
[DEBUG] io.dropwizard.metrics:metrics-jvm:jar:3.1.5:provided
[DEBUG] io.dropwizard.metrics:metrics-json:jar:3.1.5:provided (scope managed from compile) (version managed from 3.1.5)
[DEBUG] io.dropwizard.metrics:metrics-graphite:jar:3.1.5:provided
[DEBUG] com.fasterxml.jackson.core:jackson-databind:jar:2.6.7.1:provided (scope managed from compile) (version managed from 2.6.7.1)
[DEBUG] com.fasterxml.jackson.module:jackson-module-scala_2.11:jar:2.6.7.1:provided
[DEBUG] com.fasterxml.jackson.module:jackson-module-paranamer:jar:2.7.9:provided
[DEBUG] org.apache.ivy:ivy:jar:2.4.0:provided
[DEBUG] oro:oro:jar:2.0.8:provided
[DEBUG] net.razorvine:pyrolite:jar:4.13:provided
[DEBUG] net.sf.py4j:py4j:jar:0.10.7:provided
[DEBUG] org.apache.spark:spark-tags_2.11:jar:2.4.0:provided
[DEBUG] org.apache.commons:commons-crypto:jar:1.0.0:provided
[DEBUG] org.spark-project.spark:unused:jar:1.0.0:provided
[DEBUG] org.apache.spark:spark-sql_2.11:jar:2.4.0:provided
[DEBUG] com.univocity:univocity-parsers:jar:2.7.3:provided
[DEBUG] org.apache.spark:spark-sketch_2.11:jar:2.4.0:provided
[DEBUG] org.apache.spark:spark-catalyst_2.11:jar:2.4.0:provided
[DEBUG] org.codehaus.janino:janino:jar:3.0.9:provided
[DEBUG] org.codehaus.janino:commons-compiler:jar:3.0.9:provided
[DEBUG] org.antlr:antlr4-runtime:jar:4.7:provided
[DEBUG] org.apache.orc:orc-core:jar:nohive:1.5.2:provided
[DEBUG] org.apache.orc:orc-shims:jar:1.5.2:provided
[DEBUG] com.google.protobuf:protobuf-java:jar:2.5.0:provided
[DEBUG] commons-lang:commons-lang:jar:2.6:provided
[DEBUG] io.airlift:aircompressor:jar:0.10:provided
[DEBUG] org.apache.orc:orc-mapreduce:jar:nohive:1.5.2:provided
[DEBUG] org.apache.parquet:parquet-column:jar:1.10.0:provided
[DEBUG] org.apache.parquet:parquet-common:jar:1.10.0:provided
[DEBUG] org.apache.parquet:parquet-encoding:jar:1.10.0:provided
[DEBUG] org.apache.parquet:parquet-hadoop:jar:1.10.0:provided
[DEBUG] org.apache.parquet:parquet-format:jar:2.4.0:provided
[DEBUG] org.apache.parquet:parquet-jackson:jar:1.10.0:provided
[DEBUG] org.apache.arrow:arrow-vector:jar:0.10.0:provided
[DEBUG] org.apache.arrow:arrow-format:jar:0.10.0:provided
[DEBUG] org.apache.arrow:arrow-memory:jar:0.10.0:provided
[DEBUG] joda-time:joda-time:jar:2.9.9:provided
[DEBUG] com.carrotsearch:hppc:jar:0.7.2:provided
[DEBUG] com.vlkan:flatbuffers:jar:1.2.0-3f79e055:provided
[DEBUG] org.apache.spark:spark-core_2.11:jar:tests:2.4.0:test
[DEBUG] org.apache.spark:spark-streaming_2.11:jar:2.4.0:provided
[DEBUG] org.slf4j:slf4j-api:jar:1.7.7:provided
[DEBUG] log4j:log4j:jar:1.2.17:provided
[DEBUG] io.dropwizard.metrics:metrics-core:jar:3.1.5:provided
[DEBUG] org.eclipse.jetty:jetty-server:jar:9.3.20.v20170531:provided
[DEBUG] org.eclipse.jetty:jetty-http:jar:9.3.20.v20170531:provided
[DEBUG] org.eclipse.jetty:jetty-util:jar:9.3.20.v20170531:provided
[DEBUG] org.eclipse.jetty:jetty-io:jar:9.3.20.v20170531:provided
[DEBUG] com.github.dwickern:scala-nameof_2.11:jar:1.0.3:provided
[DEBUG] org.scala-lang:scala-reflect:jar:2.11.8:provided
[DEBUG] junit:junit:jar:4.12:test
[DEBUG] org.scalatest:scalatest_2.11:jar:3.0.3:test
[DEBUG] org.scalactic:scalactic_2.11:jar:3.0.3:test
[DEBUG] org.scala-lang.modules:scala-xml_2.11:jar:1.0.5:provided
[DEBUG] org.scala-lang.modules:scala-parser-combinators_2.11:jar:1.0.4:provided
[INFO]
[INFO] --- maven-clean-plugin:3.1.0:clean (default-clean) @ spark-listeners ---
[DEBUG] Configuring mojo org.apache.maven.plugins:maven-clean-plugin:3.1.0:clean from plugin realm ClassRealm[plugin>org.apache.maven.plugins:maven-clean-plugin:3.1.0, parent: jdk.internal.loader.ClassLoaders$AppClassLoader@c387f44]
[DEBUG] Configuring mojo 'org.apache.maven.plugins:maven-clean-plugin:3.1.0:clean' with basic configurator -->
[DEBUG] (f) directory = C:\Users\KOF4BE\Downloads\spark-monitoring-master\src\spark-listeners\target
[DEBUG] (f) excludeDefaultDirectories = false
[DEBUG] (f) failOnError = true
[DEBUG] (f) followSymLinks = false
[DEBUG] (f) outputDirectory = C:\Users\KOF4BE\Downloads\spark-monitoring-master\src\spark-listeners\target\classes
[DEBUG] (f) reportDirectory = C:\Users\KOF4BE\Downloads\spark-monitoring-master\src\spark-listeners\target\classes
[DEBUG] (f) retryOnError = true
[DEBUG] (f) skip = false
[DEBUG] (f) testOutputDirectory = C:\Users\KOF4BE\Downloads\spark-monitoring-master\src\spark-listeners\target\test-classes
[DEBUG] -- end configuration --
[INFO] Deleting C:\Users\KOF4BE\Downloads\spark-monitoring-master\src\spark-listeners\target
[INFO] Deleting directory C:\Users\KOF4BE\Downloads\spark-monitoring-master\src\spark-listeners\target\classes
[INFO] Deleting directory C:\Users\KOF4BE\Downloads\spark-monitoring-master\src\spark-listeners\target
[DEBUG] Skipping non-existing directory C:\Users\KOF4BE\Downloads\spark-monitoring-master\src\spark-listeners\target\classes
[DEBUG] Skipping non-existing directory C:\Users\KOF4BE\Downloads\spark-monitoring-master\src\spark-listeners\target\test-classes
[DEBUG] Skipping non-existing directory C:\Users\KOF4BE\Downloads\spark-monitoring-master\src\spark-listeners\target\classes
[INFO]
[INFO]
--- scala-maven-plugin:3.4.2:add-source (default) @ spark-listeners --- [DEBUG] Configuring mojo net.alchim31.maven:scala-maven-plugin:3.4.2:add-source from plugin realm ClassRealm[plugin>net.alchim31.maven:scala-maven-plugin:3.4.2, parent: jdk.internal.loader.ClassLoaders$AppClassLoader@c387f44] [DEBUG] Configuring mojo 'net.alchim31.maven:scala-maven-plugin:3.4.2:add-source' with basic configurator --> [DEBUG] (f) project = MavenProject: com.microsoft.pnp:spark-listeners:1.0-SNAPSHOT @ C:\Users\KOF4BE\Downloads\spark-monitoring-master\src\spark-listeners\pom.xml [DEBUG] (f) sourceDir = C:\Users\KOF4BE\Downloads\spark-monitoring-master\src\spark-listeners\src\main\java\..\scala [DEBUG] (f) testSourceDir = C:\Users\KOF4BE\Downloads\spark-monitoring-master\src\spark-listeners\src\test\java\..\scala [DEBUG] (f) useCanonicalPath = true [DEBUG] -- end configuration -- [INFO] Add Source directory: C:\Users\KOF4BE\Downloads\spark-monitoring-master\src\spark-listeners\src\main\scala [INFO] Add Test Source directory: C:\Users\KOF4BE\Downloads\spark-monitoring-master\src\spark-listeners\src\test\scala [INFO] [INFO] --- maven-clean-plugin:3.1.0:clean (auto-clean) @ spark-listeners --- [DEBUG] Configuring mojo org.apache.maven.plugins:maven-clean-plugin:3.1.0:clean from plugin realm ClassRealm[plugin>org.apache.maven.plugins:maven-clean-plugin:3.1.0, parent: jdk.internal.loader.ClassLoaders$AppClassLoader@c387f44] [DEBUG] Configuring mojo 'org.apache.maven.plugins:maven-clean-plugin:3.1.0:clean' with basic configurator --> [DEBUG] (f) directory = C:\Users\KOF4BE\Downloads\spark-monitoring-master\src\spark-listeners\target [DEBUG] (f) excludeDefaultDirectories = false [DEBUG] (f) failOnError = true [DEBUG] (f) followSymLinks = false [DEBUG] (f) outputDirectory = C:\Users\KOF4BE\Downloads\spark-monitoring-master\src\spark-listeners\target\classes [DEBUG] (f) reportDirectory = C:\Users\KOF4BE\Downloads\spark-monitoring-master\src\spark-listeners\target\classes [DEBUG] (f) retryOnError = true [DEBUG] (f) skip = false [DEBUG] (f) testOutputDirectory = C:\Users\KOF4BE\Downloads\spark-monitoring-master\src\spark-listeners\target\test-classes [DEBUG] -- end configuration -- [DEBUG] Skipping non-existing directory C:\Users\KOF4BE\Downloads\spark-monitoring-master\src\spark-listeners\target [DEBUG] Skipping non-existing directory C:\Users\KOF4BE\Downloads\spark-monitoring-master\src\spark-listeners\target\classes [DEBUG] Skipping non-existing directory C:\Users\KOF4BE\Downloads\spark-monitoring-master\src\spark-listeners\target\test-classes [DEBUG] Skipping non-existing directory C:\Users\KOF4BE\Downloads\spark-monitoring-master\src\spark-listeners\target\classes [INFO] [INFO] --- maven-resources-plugin:2.6:resources (default-resources) @ spark-listeners --- [DEBUG] Dependency collection stats: {ConflictMarker.analyzeTime=85873, ConflictMarker.markTime=34025, ConflictMarker.nodeCount=77, ConflictIdSorter.graphTime=30785, ConflictIdSorter.topsortTime=17282, ConflictIdSorter.conflictIdCount=26, ConflictIdSorter.conflictIdCycleCount=0, ConflictResolver.totalTime=364017, ConflictResolver.conflictItemCount=74, DefaultDependencyCollector.collectTime=241010843, DefaultDependencyCollector.transformTime=550345} [DEBUG] org.apache.maven.plugins:maven-resources-plugin:jar:2.6: [DEBUG] org.apache.maven:maven-plugin-api:jar:2.0.6:compile [DEBUG] org.apache.maven:maven-project:jar:2.0.6:compile [DEBUG] org.apache.maven:maven-profile:jar:2.0.6:compile [DEBUG] org.apache.maven:maven-artifact-manager:jar:2.0.6:compile [DEBUG] 
org.apache.maven:maven-plugin-registry:jar:2.0.6:compile [DEBUG] org.apache.maven:maven-core:jar:2.0.6:compile [DEBUG] org.apache.maven:maven-plugin-parameter-documenter:jar:2.0.6:compile [DEBUG] org.apache.maven.reporting:maven-reporting-api:jar:2.0.6:compile [DEBUG] org.apache.maven.doxia:doxia-sink-api:jar:1.0-alpha-7:compile [DEBUG] org.apache.maven:maven-repository-metadata:jar:2.0.6:compile [DEBUG] org.apache.maven:maven-error-diagnostics:jar:2.0.6:compile [DEBUG] commons-cli:commons-cli:jar:1.0:compile [DEBUG] org.apache.maven:maven-plugin-descriptor:jar:2.0.6:compile [DEBUG] org.codehaus.plexus:plexus-interactivity-api:jar:1.0-alpha-4:compile [DEBUG] classworlds:classworlds:jar:1.1:compile [DEBUG] org.apache.maven:maven-artifact:jar:2.0.6:compile [DEBUG] org.apache.maven:maven-settings:jar:2.0.6:compile [DEBUG] org.apache.maven:maven-model:jar:2.0.6:compile [DEBUG] org.apache.maven:maven-monitor:jar:2.0.6:compile [DEBUG] org.codehaus.plexus:plexus-container-default:jar:1.0-alpha-9-stable-1:compile [DEBUG] junit:junit:jar:3.8.1:compile [DEBUG] org.codehaus.plexus:plexus-utils:jar:2.0.5:compile [DEBUG] org.apache.maven.shared:maven-filtering:jar:1.1:compile [DEBUG] org.sonatype.plexus:plexus-build-api:jar:0.0.4:compile [DEBUG] org.codehaus.plexus:plexus-interpolation:jar:1.13:compile [DEBUG] Created new class realm plugin>org.apache.maven.plugins:maven-resources-plugin:2.6 [DEBUG] Importing foreign packages into class realm plugin>org.apache.maven.plugins:maven-resources-plugin:2.6 [DEBUG] Imported: < maven.api [DEBUG] Populating class realm plugin>org.apache.maven.plugins:maven-resources-plugin:2.6 [DEBUG] Included: org.apache.maven.plugins:maven-resources-plugin:jar:2.6 [DEBUG] Included: org.apache.maven.reporting:maven-reporting-api:jar:2.0.6 [DEBUG] Included: org.apache.maven.doxia:doxia-sink-api:jar:1.0-alpha-7 [DEBUG] Included: commons-cli:commons-cli:jar:1.0 [DEBUG] Included: org.codehaus.plexus:plexus-interactivity-api:jar:1.0-alpha-4 [DEBUG] Included: junit:junit:jar:3.8.1 [DEBUG] Included: org.codehaus.plexus:plexus-utils:jar:2.0.5 [DEBUG] Included: org.apache.maven.shared:maven-filtering:jar:1.1 [DEBUG] Included: org.sonatype.plexus:plexus-build-api:jar:0.0.4 [DEBUG] Included: org.codehaus.plexus:plexus-interpolation:jar:1.13 [DEBUG] Configuring mojo org.apache.maven.plugins:maven-resources-plugin:2.6:resources from plugin realm ClassRealm[plugin>org.apache.maven.plugins:maven-resources-plugin:2.6, parent: jdk.internal.loader.ClassLoaders$AppClassLoader@c387f44] [DEBUG] Configuring mojo 'org.apache.maven.plugins:maven-resources-plugin:2.6:resources' with basic configurator --> [DEBUG] (f) buildFilters = [] [DEBUG] (f) encoding = UTF-8 [DEBUG] (f) escapeWindowsPaths = true [DEBUG] (s) includeEmptyDirs = false [DEBUG] (s) outputDirectory = C:\Users\KOF4BE\Downloads\spark-monitoring-master\src\spark-listeners\target\classes [DEBUG] (s) overwrite = false [DEBUG] (f) project = MavenProject: com.microsoft.pnp:spark-listeners:1.0-SNAPSHOT @ C:\Users\KOF4BE\Downloads\spark-monitoring-master\src\spark-listeners\pom.xml [DEBUG] (s) resources = [Resource {targetPath: null, filtering: false, FileSet {directory: C:\Users\KOF4BE\Downloads\spark-monitoring-master\src\spark-listeners\src\main\resources, PatternSet [includes: {}, excludes: {}]}}] [DEBUG] (f) session = org.apache.maven.execution.MavenSession@2e807c54 [DEBUG] (f) supportMultiLineFiltering = false [DEBUG] (f) useBuildFilters = true [DEBUG] (s) useDefaultDelimiters = true [DEBUG] -- end configuration -- [DEBUG] properties used 
{env.NUMBER_OF_PROCESSORS=8, env.USERPROFILE=C:\Users\KOF4BE, java.specification.version=12, sun.cpu.isalist=amd64, env.ML_DP_SERVER_NG2=\\SI0VM1385.de.bosch.com\eForms\WTS\NG2, sun.arch.data.model=64, env.PROGRAMW6432=C:\Program Files, env.SAPKM_USER_TEMP=C:\Users\KOF4BE\AppData\Local, java.vendor.url=https://java.oracle.com/, env.OS=Windows_NT, sun.boot.library.path=C:\Program Files\Java\jdk-12.0.1\bin, sun.java.command=org.codehaus.plexus.classworlds.launcher.Launcher clean install -e -X, env.SYSTEMROOT=C:\WINDOWS, jdk.debug=release, maven.version=3.6.1, java.specification.vendor=Oracle Corporation, java.version.date=2019-04-16, java.home=C:\Program Files\Java\jdk-12.0.1, java.vm.specification.vendor=Oracle Corporation, java.specification.name=Java Platform API Specification, env.LOCALAPPDATA=C:\Users\KOF4BE\AppData\Local, env.USERDOMAIN_ROAMINGPROFILE=DE, user.script=, sun.management.compiler=HotSpot 64-Bit Tiered Compilers, java.runtime.version=12.0.1+12, env.PATH=C:\Program Files\Java\jdk-12.0.1\bin;C:\WINDOWS\system32;C:\WINDOWS;C:\WINDOWS\System32\Wbem;C:\WINDOWS\System32\WindowsPowerShell\v1.0\;C:\WINDOWS\System32\OpenSSH\;C:\Program Files\7-Zip;C:\winutils\bin;C:\Users\KOF4BE\AppData\Local\Continuum\anaconda3\Library\bin;C:\Program Files (x86)\apache-maven-3.6.1\bin;C:\Program Files\PuTTY\;C:\Program Files (x86)\scala\bin;C:\WINDOWS\system32;C:\WINDOWS;C:\WINDOWS\System32\Wbem;C:\WINDOWS\System32\WindowsPowerShell\v1.0\;C:\WINDOWS\System32\OpenSSH\;C:\Program Files\7-Zip;C:\Program Files\PuTTY\;C:\WINDOWS\system32;C:\WINDOWS;C:\WINDOWS\System32\Wbem;C:\WINDOWS\System32\WindowsPowerShell\v1.0\;C:\WINDOWS\System32\OpenSSH\;C:\Program Files\7-Zip;C:\Program Files\PuTTY\;C:\Users\KOF4BE\AppData\Local\Microsoft\WindowsApps;C:\Users\KOF4BE\AppData\Local\Programs\Microsoft VS Code\bin;C:\Users\KOF4BE\AppData\Local\Continuum\anaconda3\Scripts;C:\Users\KOF4BE\AppData\Local\Continuum\anaconda3;C:\Users\KOF4BE\AppData\Local\Continuum\anaconda3\Library\mingw-w64\bin;C:\Users\KOF4BE\AppData\Local\Continuum\anaconda3\Library;C:\Users\KOF4BE\AppData\Local\Programs\Git\cmd;C:\Program Files\JetBrains\IntelliJ IDEA 2019.1.1\bin;C:\Users\KOF4BE\AppData\Local\GitHubDesktop\bin, env.PUBLIC=C:\Users\Public, env.EXEC_DIR=C:\Users\KOF4BE\Downloads\spark-monitoring-master\src, env.COMMONPROGRAMW6432=C:\Program Files\Common Files, file.encoding=Cp1252, env.COMPUTERNAME=BE12Z195, env.HOMEPATH=\, env.APPDATA=C:\Users\KOF4BE\AppData\Roaming, java.io.tmpdir=C:\Users\KOF4BE\AppData\Local\Temp\, env.SSF_LIBRARY_PATH=C:\Program Files (x86)\SAP\FrontEnd\SecureLogin\lib\sapcrypto.dll, java.version=12.0.1, java.vm.specification.name=Java Virtual Machine Specification, env.HOMESHARE=\\be60fs02.de.bosch.com\KOF4BE$, java.library.path=C:\Program Files\Java\jdk-12.0.1\bin;C:\WINDOWS\Sun\Java\bin;C:\WINDOWS\system32;C:\WINDOWS;C:\Program Files\Java\jdk-12.0.1\bin;C:\WINDOWS\system32;C:\WINDOWS;C:\WINDOWS\System32\Wbem;C:\WINDOWS\System32\WindowsPowerShell\v1.0\;C:\WINDOWS\System32\OpenSSH\;C:\Program Files\7-Zip;C:\winutils\bin;C:\Users\KOF4BE\AppData\Local\Continuum\anaconda3\Library\bin;C:\Program Files (x86)\apache-maven-3.6.1\bin;C:\Program Files\PuTTY\;C:\Program Files (x86)\scala\bin;C:\WINDOWS\system32;C:\WINDOWS;C:\WINDOWS\System32\Wbem;C:\WINDOWS\System32\WindowsPowerShell\v1.0\;C:\WINDOWS\System32\OpenSSH\;C:\Program Files\7-Zip;C:\Program Files\PuTTY\;C:\WINDOWS\system32;C:\WINDOWS;C:\WINDOWS\System32\Wbem;C:\WINDOWS\System32\WindowsPowerShell\v1.0\;C:\WINDOWS\System32\OpenSSH\;C:\Program 
Files\7-Zip;C:\Program Files\PuTTY\;C:\Users\KOF4BE\AppData\Local\Microsoft\WindowsApps;C:\Users\KOF4BE\AppData\Local\Programs\Microsoft VS Code\bin;C:\Users\KOF4BE\AppData\Local\Continuum\anaconda3\Scripts;C:\Users\KOF4BE\AppData\Local\Continuum\anaconda3;C:\Users\KOF4BE\AppData\Local\Continuum\anaconda3\Library\mingw-w64\bin;C:\Users\KOF4BE\AppData\Local\Continuum\anaconda3\Library;C:\Users\KOF4BE\AppData\Local\Programs\Git\cmd;C:\Program Files\JetBrains\IntelliJ IDEA 2019.1.1\bin;C:\Users\KOF4BE\AppData\Local\GitHubDesktop\bin;., java.vendor=Oracle Corporation, env.ERROR_CODE=0, classworlds.conf=C:\Program Files (x86)\apache-maven-3.6.1\bin\..\bin\m2.conf, sun.io.unicode.encoding=UnicodeLittle, sun.desktop=windows, env.HADOOP_HOME=C:\winutils, java.vm.specification.version=12, os.name=Windows 10, env.=EXITCODE=00000000, env.=::=::\, maven.compiler.source=1.8, user.home=C:\Users\KOF4BE, env.ALLUSERSPROFILE=C:\ProgramData, env.SESSIONNAME=Console, java.awt.graphicsenv=sun.awt.Win32GraphicsEnvironment, env.PYTHONPATH=C:\Users\KOF4BE\AppData\Local\Continuum\anaconda3\Library, path.separator=;, os.version=10.0, env.CLASSWORLDS_LAUNCHER=org.codehaus.plexus.classworlds.launcher.Launcher, env.PATHEXT=.COM;.EXE;.BAT;.CMD;.VBS;.VBE;.JS;.JSE;.WSF;.WSH;.MSC, java.vm.name=Java HotSpot(TM) 64-Bit Server VM, env.USERNAME=KOF4BE, os.arch=amd64, maven.multiModuleProjectDirectory=C:\Users\KOF4BE\Downloads\spark-monitoring-master\src, user.language.format=de, env.MAVEN_PROJECTBASEDIR=C:\Users\KOF4BE\Downloads\spark-monitoring-master\src, env.SNC_LIB_64=C:\Program Files\SAP\FrontEnd\SecureLogin\lib\sapcrypto.dll, java.vm.info=mixed mode, sharing, env.TEMP=C:\Users\KOF4BE\AppData\Local\Temp, java.class.version=56.0, env.ML_DP_SERVER_NG=\\SI0VM1385.de.bosch.com\eForms\WTS\NG, awt.toolkit=sun.awt.windows.WToolkit, sun.jnu.encoding=Cp1252, slf4j.version=1.7.7, env.UATDATA=C:\WINDOWS\CCM\UATData\D9F8C395-CAB8-491d-B8AC-179A1FE1BE77, user.country.format=DE, maven.build.version=Apache Maven 3.6.1 (d66c9c0b3152b2e69ee9bac180bb8fcc8e6af555; 2019-04-04T21:00:29+02:00), maven.home=C:\Program Files (x86)\apache-maven-3.6.1\bin\.., sun.stderr.encoding=cp437, env.SAPLOGON_INI_FILE=C:\ProgramData\sap\saplogon.ini, env.JAVA_HOME=C:\Program Files\Java\jdk-12.0.1, env.PROGRAMFILES=C:\Program Files, file.separator=\, java.vm.compressedOopsMode=32-bit, spark.version=2.4.0, env.CLASSWORLDS_JAR="C:\Program Files (x86)\apache-maven-3.6.1\bin\..\boot\plexus-classworlds-2.6.0.jar", line.separator= , env.PROMPT=$P$G, env.PROCESSOR_REVISION=8e0a, env.PROCESSOR_IDENTIFIER=Intel64 Family 6 Model 142 Stepping 10, GenuineIntel, env.PROGRAMDATA=C:\ProgramData, user.name=KOF4BE, env.SBT_HOME=C:\Program Files (x86)\scala\bin, env.SNC_LIB_32=C:\Program Files (x86)\SAP\FrontEnd\SecureLogin\lib\sapcrypto.dll, env.DRIVERDATA=C:\Windows\System32\Drivers\DriverData, env.SYSTEMDRIVE=C:, env.JVMCONFIG=\.mvn\jvm.config, env.PROGRAMFILES(X86)=C:\Program Files (x86), env.PROCESSOR_LEVEL=6, env.HOMEDRIVE=U:, env.PSMODULEPATH=C:\Program Files\WindowsPowerShell\Modules;C:\WINDOWS\system32\WindowsPowerShell\v1.0\Modules, env.TMP=C:\Users\KOF4BE\AppData\Local\Temp, project.reporting.outputEncoding=UTF-8, sun.os.patch.level=, maven.compiler.target=1.8, env.LOGONSERVER=\\BE-BCD02, env.ML_FORK=0, library.jansi.path=C:\Program Files (x86)\apache-maven-3.6.1\bin\..\lib\jansi-native, env.WINDIR=C:\WINDOWS, env.USERDNSDOMAIN=DE.BOSCH.COM, java.class.path=C:\Program Files (x86)\apache-maven-3.6.1\bin\..\boot\plexus-classworlds-2.6.0.jar, 
env.PKIDATA=\\be60fs02.de.bosch.com\KOF4BE$, env.SNC_LIB=C:\Program Files (x86)\SAP\FrontEnd\SecureLogin\lib\sapcrypto.dll, java.vm.vendor=Oracle Corporation, env.PROCESSOR_ARCHITECTURE=AMD64, env.ML_DP_SERVER_CL=\\SI0VM1385.de.bosch.com\eForms\WTS\BOS600, user.variant=, env.COMMONPROGRAMFILES=C:\Program Files\Common Files, maven.conf=C:\Program Files (x86)\apache-maven-3.6.1\bin\../conf, sun.java.launcher=SUN_STANDARD, user.country=US, env.USERDOMAIN=DE, env.COMSPEC=C:\WINDOWS\system32\cmd.exe, sun.cpu.endian=little, user.language=en, env.ASL.LOG=Destination=file, env.JAVACMD=C:\Program Files\Java\jdk-12.0.1\bin\java.exe, env.SCALA_HOME=C:\Program Files (x86)\scala\bin, scala.version=2.11.8, env.MAVEN_HOME=C:\Program Files (x86)\apache-maven-3.6.1\bin\.., env.INTELLIJ IDEA=C:\Program Files\JetBrains\IntelliJ IDEA 2019.1.1\bin;, env.WDIR=C:\, env.SSF_LIBRARY_PATH_64=C:\Program Files\SAP\FrontEnd\SecureLogin\lib\sapcrypto.dll, env.=C:=C:\Users\KOF4BE\Downloads\spark-monitoring-master\src, java.runtime.name=Java(TM) SE Runtime Environment, scala.compat.version=2.11, project.build.sourceEncoding=UTF-8, env.COMMONPROGRAMFILES(X86)=C:\Program Files (x86)\Common Files, env.MAVEN_CMD_LINE_ARGS=clean install -e -X, java.vendor.url.bug=https://bugreport.java.com/bugreport/, env.HTTPS_PROXY=http://localhost:3128, user.dir=C:\Users\KOF4BE\Downloads\spark-monitoring-master\src, java.vm.version=12.0.1+12} [INFO] Using 'UTF-8' encoding to copy filtered resources. [DEBUG] resource with targetPath null directory C:\Users\KOF4BE\Downloads\spark-monitoring-master\src\spark-listeners\src\main\resources excludes [] includes [] [INFO] skip non existing resourceDirectory C:\Users\KOF4BE\Downloads\spark-monitoring-master\src\spark-listeners\src\main\resources [DEBUG] no use filter components [INFO] [INFO] --- scala-maven-plugin:3.4.2:compile (default) @ spark-listeners --- [DEBUG] Configuring mojo net.alchim31.maven:scala-maven-plugin:3.4.2:compile from plugin realm ClassRealm[plugin>net.alchim31.maven:scala-maven-plugin:3.4.2, parent: jdk.internal.loader.ClassLoaders$AppClassLoader@c387f44] [DEBUG] Configuring mojo 'net.alchim31.maven:scala-maven-plugin:3.4.2:compile' with basic configurator --> [DEBUG] (f) analysisCacheFile = C:\Users\KOF4BE\Downloads\spark-monitoring-master\src\spark-listeners\target\analysis\compile [DEBUG] (f) args = [-target:jvm-1.8, -feature, -dependencyfile, C:\Users\KOF4BE\Downloads\spark-monitoring-master\src\spark-listeners\target/.scala_dependencies] [DEBUG] (f) checkMultipleScalaVersions = true [DEBUG] (f) compileOrder = mixed [DEBUG] (f) displayCmd = false [DEBUG] (f) encoding = UTF-8 [DEBUG] (f) failOnMultipleScalaVersions = false [DEBUG] (f) forceUseArgFile = false [DEBUG] (f) fork = true [DEBUG] (f) javacArgs = [-source, 1.8, -target, 1.8] [DEBUG] (f) javacGenerateDebugSymbols = true [DEBUG] (f) localRepo = id: local url: file:///C:/Users/KOF4BE/.m2/repository/ layout: default snapshots: [enabled => true, update => always] releases: [enabled => true, update => always]

[DEBUG] (f) localRepository = id: local
url: file:///C:/Users/KOF4BE/.m2/repository/
layout: default
snapshots: [enabled => true, update => always]
releases: [enabled => true, update => always]

[DEBUG] (f) notifyCompilation = true
[DEBUG] (f) outputDir = C:\Users\KOF4BE\Downloads\spark-monitoring-master\src\spark-listeners\target\classes
[DEBUG] (f) pluginArtifacts = [net.alchim31.maven:scala-maven-plugin:maven-plugin:3.4.2:, org.apache.maven:maven-compat:jar:3.5.4:compile, org.apache.maven:maven-model-builder:jar:3.5.4:compile, org.apache.maven:maven-settings:jar:3.5.4:compile, org.apache.maven:maven-settings-builder:jar:3.5.4:compile, org.sonatype.plexus:plexus-sec-dispatcher:jar:1.4:compile, org.sonatype.plexus:plexus-cipher:jar:1.4:compile, org.apache.maven:maven-artifact:jar:3.5.4:compile, org.apache.maven:maven-resolver-provider:jar:3.5.4:compile, org.apache.maven:maven-repository-metadata:jar:3.5.4:compile, org.apache.maven.resolver:maven-resolver-api:jar:1.1.1:compile, org.apache.maven.resolver:maven-resolver-util:jar:1.1.1:compile, org.apache.maven.resolver:maven-resolver-impl:jar:1.1.1:compile, org.codehaus.plexus:plexus-interpolation:jar:1.24:compile, org.eclipse.sisu:org.eclipse.sisu.plexus:jar:0.3.3:compile, javax.enterprise:cdi-api:jar:1.0:compile, javax.annotation:jsr250-api:jar:1.0:compile, org.codehaus.plexus:plexus-component-annotations:jar:1.7.1:compile, org.apache.maven.wagon:wagon-provider-api:jar:3.1.0:compile, org.apache.maven.reporting:maven-reporting-api:jar:3.0:compile, org.apache.maven:maven-core:jar:3.5.4:compile, org.apache.maven:maven-builder-support:jar:3.5.4:compile, org.apache.maven:maven-plugin-api:jar:3.5.4:compile, org.apache.maven.resolver:maven-resolver-spi:jar:1.1.1:compile, org.apache.maven.shared:maven-shared-utils:jar:3.2.1:compile, org.eclipse.sisu:org.eclipse.sisu.inject:jar:0.3.3:compile, com.google.inject:guice:jar:no_aop:4.2.0:compile, aopalliance:aopalliance:jar:1.0:compile, com.google.guava:guava:jar:20.0:compile, javax.inject:javax.inject:jar:1:compile, org.apache.commons:commons-lang3:jar:3.5:compile, org.apache.maven.shared:maven-dependency-tree:jar:3.0.1:compile, org.eclipse.aether:aether-util:jar:0.9.0.M2:compile, org.apache.commons:commons-exec:jar:1.3:compile, org.codehaus.plexus:plexus-utils:jar:3.1.0:compile, org.codehaus.plexus:plexus-archiver:jar:3.6.0:compile, org.codehaus.plexus:plexus-io:jar:3.0.1:compile, org.apache.commons:commons-compress:jar:1.16.1:compile, org.objenesis:objenesis:jar:2.6:compile, org.iq80.snappy:snappy:jar:0.4:compile, org.tukaani:xz:jar:1.8:runtime, org.codehaus.plexus:plexus-classworlds:jar:2.5.2:compile, org.apache.maven:maven-project:jar:2.2.1:compile, org.apache.maven:maven-profile:jar:2.2.1:compile, org.apache.maven:maven-artifact-manager:jar:2.2.1:compile, backport-util-concurrent:backport-util-concurrent:jar:3.1:compile, org.apache.maven:maven-plugin-registry:jar:2.2.1:compile, org.codehaus.plexus:plexus-container-default:jar:1.0-alpha-9-stable-1:compile, junit:junit:jar:3.8.1:compile, classworlds:classworlds:jar:1.1-alpha-2:compile, org.apache.maven:maven-archiver:jar:3.2.0:compile, commons-io:commons-io:jar:2.5:compile, org.apache.maven.doxia:doxia-sink-api:jar:1.8:compile, org.apache.maven.doxia:doxia-logging-api:jar:1.8:compile, org.apache.maven:maven-model:jar:3.5.4:compile, org.apache.maven.shared:maven-invoker:jar:3.0.1:compile, com.typesafe.zinc:zinc:jar:0.3.15:compile, org.scala-lang:scala-library:jar:2.10.6:compile, com.typesafe.sbt:incremental-compiler:jar:0.13.15:compile, org.scala-lang:scala-compiler:jar:2.10.6:compile, org.scala-lang:scala-reflect:jar:2.10.6:compile, com.typesafe.sbt:sbt-interface:jar:0.13.15:compile, com.typesafe.sbt:compiler-interface:jar:sources:0.13.15:compile]
[DEBUG] (f) project = MavenProject: com.microsoft.pnp:spark-listeners:1.0-SNAPSHOT @ C:\Users\KOF4BE\Downloads\spark-monitoring-master\src\spark-listeners\pom.xml
[DEBUG] (f) reactorProjects = [MavenProject: com.microsoft.pnp:spark-monitoring:1.0-SNAPSHOT @ C:\Users\KOF4BE\Downloads\spark-monitoring-master\src\pom.xml, MavenProject: com.microsoft.pnp:spark-listeners:1.0-SNAPSHOT @ C:\Users\KOF4BE\Downloads\spark-monitoring-master\src\spark-listeners\pom.xml, MavenProject: com.microsoft.pnp:spark-jobs:1.0-SNAPSHOT @ C:\Users\KOF4BE\Downloads\spark-monitoring-master\src\spark-jobs\pom.xml, MavenProject: com.microsoft.pnp:spark-listeners-loganalytics:1.0-SNAPSHOT @ C:\Users\KOF4BE\Downloads\spark-monitoring-master\src\spark-listeners-loganalytics\pom.xml]
[DEBUG] (f) recompileMode = incremental
[DEBUG] (f) remoteRepos = [ id: bci-mvn
url: https://rb-artifactory.bosch.com/artifactory/bci-mvn-virtual
layout: default
snapshots: [enabled => true, update => daily]
releases: [enabled => true, update => daily]
, id: central
url: https://repo.maven.apache.org/maven2
layout: default
snapshots: [enabled => false, update => daily]
releases: [enabled => true, update => daily]
]
[DEBUG] (f) scalaClassName = scala.tools.nsc.Main
[DEBUG] (f) scalaCompatVersion = 2.11
[DEBUG] (f) scalaOrganization = org.scala-lang
[DEBUG] (f) scalaVersion = 2.11.8
[DEBUG] (f) sendJavaToScalac = true
[DEBUG] (f) session = org.apache.maven.execution.MavenSession@2e807c54
[DEBUG] (f) source = 1.8
[DEBUG] (f) sourceDir = C:\Users\KOF4BE\Downloads\spark-monitoring-master\src\spark-listeners\src\main\java\..\scala
[DEBUG] (f) target = 1.8
[DEBUG] (f) useCanonicalPath = true
[DEBUG] (f) useZincServer = false
[DEBUG] (f) zincHost = 127.0.0.1
[DEBUG] (f) zincPort = 3030
[DEBUG] -- end configuration --
[DEBUG] Checking for multiple versions of scala
[DEBUG] building maven31 dependency graph for com.microsoft.pnp:spark-listeners:jar:1.0-SNAPSHOT with Maven31DependencyGraphBuilder
[DEBUG] Dependency collection stats: {ConflictMarker.analyzeTime=1944302, ConflictMarker.markTime=1227611, ConflictMarker.nodeCount=2338, ConflictIdSorter.graphTime=662683, ConflictIdSorter.topsortTime=103696, ConflictIdSorter.conflictIdCount=169, ConflictIdSorter.conflictIdCycleCount=0, ConflictResolver.totalTime=5743253, ConflictResolver.conflictItemCount=467, DefaultDependencyCollector.collectTime=13431348, DefaultDependencyCollector.transformTime=9699908}
[DEBUG] com.microsoft.pnp:spark-listeners:jar:1.0-SNAPSHOT
[DEBUG] org.scala-lang:scala-library:jar:2.11.8:provided
[DEBUG] org.mockito:mockito-core:jar:1.10.19:test
[DEBUG] org.hamcrest:hamcrest-core:jar:1.1:test
[DEBUG] org.objenesis:objenesis:jar:2.1:provided
[DEBUG] org.apache.spark:spark-core_2.11:jar:2.4.0:provided
[DEBUG] org.apache.avro:avro:jar:1.8.2:provided
[DEBUG] org.codehaus.jackson:jackson-core-asl:jar:1.9.13:provided
[DEBUG] org.codehaus.jackson:jackson-mapper-asl:jar:1.9.13:provided
[DEBUG] com.thoughtworks.paranamer:paranamer:jar:2.7:provided
[DEBUG] org.apache.commons:commons-compress:jar:1.8.1:provided
[DEBUG] org.tukaani:xz:jar:1.5:provided
[DEBUG] org.apache.avro:avro-mapred:jar:hadoop2:1.8.2:provided
[DEBUG] org.apache.avro:avro-ipc:jar:1.8.2:provided
[DEBUG] commons-codec:commons-codec:jar:1.10:provided (scope managed from compile) (version managed from 1.9)
[DEBUG] com.twitter:chill_2.11:jar:0.9.3:provided
[DEBUG] com.esotericsoftware:kryo-shaded:jar:4.0.2:provided
[DEBUG] com.esotericsoftware:minlog:jar:1.3.0:provided
[DEBUG] com.twitter:chill-java:jar:0.9.3:provided
[DEBUG] org.apache.xbean:xbean-asm6-shaded:jar:4.8:provided
[DEBUG] org.apache.hadoop:hadoop-client:jar:2.6.5:provided
[DEBUG] org.apache.hadoop:hadoop-common:jar:2.6.5:provided
[DEBUG] commons-cli:commons-cli:jar:1.2:provided
[DEBUG] xmlenc:xmlenc:jar:0.52:provided
[DEBUG] commons-httpclient:commons-httpclient:jar:3.1:provided
[DEBUG] commons-io:commons-io:jar:2.4:provided
[DEBUG] commons-collections:commons-collections:jar:3.2.2:provided
[DEBUG] commons-configuration:commons-configuration:jar:1.6:provided
[DEBUG] commons-digester:commons-digester:jar:1.8:provided
[DEBUG] commons-beanutils:commons-beanutils:jar:1.7.0:provided
[DEBUG] commons-beanutils:commons-beanutils-core:jar:1.8.0:provided
[DEBUG] com.google.code.gson:gson:jar:2.2.4:provided
[DEBUG] org.apache.hadoop:hadoop-auth:jar:2.6.5:provided
[DEBUG] org.apache.httpcomponents:httpclient:jar:4.5.4:provided (scope managed from compile) (version managed from 4.2.5)
[DEBUG] org.apache.httpcomponents:httpcore:jar:4.4.7:provided
[DEBUG] org.apache.directory.server:apacheds-kerberos-codec:jar:2.0.0-M15:provided
[DEBUG] org.apache.directory.server:apacheds-i18n:jar:2.0.0-M15:provided
[DEBUG] org.apache.directory.api:api-asn1-api:jar:1.0.0-M20:provided
[DEBUG] org.apache.directory.api:api-util:jar:1.0.0-M20:provided
[DEBUG] org.apache.curator:curator-client:jar:2.6.0:provided
[DEBUG] org.htrace:htrace-core:jar:3.0.4:provided
[DEBUG] org.apache.hadoop:hadoop-hdfs:jar:2.6.5:provided
[DEBUG] org.mortbay.jetty:jetty-util:jar:6.1.26:provided
[DEBUG] xerces:xercesImpl:jar:2.9.1:provided
[DEBUG] xml-apis:xml-apis:jar:1.3.04:provided
[DEBUG] org.apache.hadoop:hadoop-mapreduce-client-app:jar:2.6.5:provided
[DEBUG] org.apache.hadoop:hadoop-mapreduce-client-common:jar:2.6.5:provided
[DEBUG] org.apache.hadoop:hadoop-yarn-client:jar:2.6.5:provided
[DEBUG] org.apache.hadoop:hadoop-yarn-server-common:jar:2.6.5:provided
[DEBUG] org.apache.hadoop:hadoop-mapreduce-client-shuffle:jar:2.6.5:provided
[DEBUG] org.apache.hadoop:hadoop-yarn-api:jar:2.6.5:provided
[DEBUG] org.apache.hadoop:hadoop-mapreduce-client-core:jar:2.6.5:provided
[DEBUG] org.apache.hadoop:hadoop-yarn-common:jar:2.6.5:provided
[DEBUG] javax.xml.bind:jaxb-api:jar:2.2.2:provided
[DEBUG] javax.xml.stream:stax-api:jar:1.0-2:provided
[DEBUG] org.codehaus.jackson:jackson-jaxrs:jar:1.9.13:provided
[DEBUG] org.codehaus.jackson:jackson-xc:jar:1.9.13:provided
[DEBUG] org.apache.hadoop:hadoop-mapreduce-client-jobclient:jar:2.6.5:provided
[DEBUG] org.apache.hadoop:hadoop-annotations:jar:2.6.5:provided
[DEBUG] org.apache.spark:spark-launcher_2.11:jar:2.4.0:provided
[DEBUG] org.apache.spark:spark-kvstore_2.11:jar:2.4.0:provided
[DEBUG] org.fusesource.leveldbjni:leveldbjni-all:jar:1.8:provided
[DEBUG] com.fasterxml.jackson.core:jackson-core:jar:2.6.7:provided
[DEBUG] com.fasterxml.jackson.core:jackson-annotations:jar:2.6.7:provided
[DEBUG] org.apache.spark:spark-network-common_2.11:jar:2.4.0:provided
[DEBUG] org.apache.spark:spark-network-shuffle_2.11:jar:2.4.0:provided
[DEBUG] org.apache.spark:spark-unsafe_2.11:jar:2.4.0:provided
[DEBUG] javax.activation:activation:jar:1.1.1:provided
[DEBUG] org.apache.curator:curator-recipes:jar:2.6.0:provided
[DEBUG] org.apache.curator:curator-framework:jar:2.6.0:provided
[DEBUG] com.google.guava:guava:jar:16.0.1:provided
[DEBUG] org.apache.zookeeper:zookeeper:jar:3.4.6:provided
[DEBUG] javax.servlet:javax.servlet-api:jar:3.1.0:provided
[DEBUG] org.apache.commons:commons-lang3:jar:3.5:provided
[DEBUG] org.apache.commons:commons-math3:jar:3.4.1:provided
[DEBUG] com.google.code.findbugs:jsr305:jar:1.3.9:provided
[DEBUG] org.slf4j:jul-to-slf4j:jar:1.7.16:provided
[DEBUG] org.slf4j:jcl-over-slf4j:jar:1.7.16:provided
[DEBUG] org.slf4j:slf4j-log4j12:jar:1.7.16:provided
[DEBUG] com.ning:compress-lzf:jar:1.0.3:provided
[DEBUG] org.xerial.snappy:snappy-java:jar:1.1.7.1:provided
[DEBUG] org.lz4:lz4-java:jar:1.4.0:provided
[DEBUG] com.github.luben:zstd-jni:jar:1.3.2-2:provided
[DEBUG] org.roaringbitmap:RoaringBitmap:jar:0.5.11:provided
[DEBUG] commons-net:commons-net:jar:3.1:provided
[DEBUG] org.json4s:json4s-jackson_2.11:jar:3.5.3:provided
[DEBUG] org.json4s:json4s-core_2.11:jar:3.5.3:provided
[DEBUG] org.json4s:json4s-ast_2.11:jar:3.5.3:provided
[DEBUG] org.json4s:json4s-scalap_2.11:jar:3.5.3:provided
[DEBUG] org.glassfish.jersey.core:jersey-client:jar:2.22.2:provided
[DEBUG] javax.ws.rs:javax.ws.rs-api:jar:2.0.1:provided
[DEBUG] org.glassfish.hk2:hk2-api:jar:2.4.0-b34:provided
[DEBUG] org.glassfish.hk2:hk2-utils:jar:2.4.0-b34:provided
[DEBUG] org.glassfish.hk2.external:aopalliance-repackaged:jar:2.4.0-b34:provided
[DEBUG] org.glassfish.hk2.external:javax.inject:jar:2.4.0-b34:provided
[DEBUG] org.glassfish.hk2:hk2-locator:jar:2.4.0-b34:provided
[DEBUG] org.javassist:javassist:jar:3.18.1-GA:provided
[DEBUG] org.glassfish.jersey.core:jersey-common:jar:2.22.2:provided
[DEBUG] javax.annotation:javax.annotation-api:jar:1.2:provided
[DEBUG] org.glassfish.jersey.bundles.repackaged:jersey-guava:jar:2.22.2:provided
[DEBUG] org.glassfish.hk2:osgi-resource-locator:jar:1.0.1:provided
[DEBUG] org.glassfish.jersey.core:jersey-server:jar:2.22.2:provided
[DEBUG] org.glassfish.jersey.media:jersey-media-jaxb:jar:2.22.2:provided
[DEBUG] javax.validation:validation-api:jar:1.1.0.Final:provided
[DEBUG] org.glassfish.jersey.containers:jersey-container-servlet:jar:2.22.2:provided
[DEBUG] org.glassfish.jersey.containers:jersey-container-servlet-core:jar:2.22.2:provided
[DEBUG] io.netty:netty-all:jar:4.1.17.Final:provided
[DEBUG] io.netty:netty:jar:3.9.9.Final:provided
[DEBUG] com.clearspring.analytics:stream:jar:2.7.0:provided
[DEBUG] io.dropwizard.metrics:metrics-jvm:jar:3.1.5:provided
[DEBUG] io.dropwizard.metrics:metrics-json:jar:3.1.5:provided (scope managed from compile) (version managed from 3.1.5)
[DEBUG] io.dropwizard.metrics:metrics-graphite:jar:3.1.5:provided
[DEBUG] com.fasterxml.jackson.core:jackson-databind:jar:2.6.7.1:provided (scope managed from compile) (version managed from 2.6.7.1)
[DEBUG] com.fasterxml.jackson.module:jackson-module-scala_2.11:jar:2.6.7.1:provided
[DEBUG] com.fasterxml.jackson.module:jackson-module-paranamer:jar:2.7.9:provided
[DEBUG] org.apache.ivy:ivy:jar:2.4.0:provided
[DEBUG] oro:oro:jar:2.0.8:provided
[DEBUG] net.razorvine:pyrolite:jar:4.13:provided
[DEBUG] net.sf.py4j:py4j:jar:0.10.7:provided
[DEBUG] org.apache.spark:spark-tags_2.11:jar:2.4.0:provided
[DEBUG] org.apache.commons:commons-crypto:jar:1.0.0:provided
[DEBUG] org.spark-project.spark:unused:jar:1.0.0:provided
[DEBUG] org.apache.spark:spark-sql_2.11:jar:2.4.0:provided
[DEBUG] com.univocity:univocity-parsers:jar:2.7.3:provided
[DEBUG] org.apache.spark:spark-sketch_2.11:jar:2.4.0:provided
[DEBUG] org.apache.spark:spark-catalyst_2.11:jar:2.4.0:provided
[DEBUG] org.codehaus.janino:janino:jar:3.0.9:provided
[DEBUG] org.codehaus.janino:commons-compiler:jar:3.0.9:provided
[DEBUG] org.antlr:antlr4-runtime:jar:4.7:provided
[DEBUG] org.apache.orc:orc-core:jar:nohive:1.5.2:provided
[DEBUG] org.apache.orc:orc-shims:jar:1.5.2:provided
[DEBUG] com.google.protobuf:protobuf-java:jar:2.5.0:provided
[DEBUG] commons-lang:commons-lang:jar:2.6:provided
[DEBUG] io.airlift:aircompressor:jar:0.10:provided
[DEBUG] org.apache.orc:orc-mapreduce:jar:nohive:1.5.2:provided
[DEBUG] org.apache.parquet:parquet-column:jar:1.10.0:provided
[DEBUG] org.apache.parquet:parquet-common:jar:1.10.0:provided
[DEBUG] org.apache.parquet:parquet-encoding:jar:1.10.0:provided
[DEBUG] org.apache.parquet:parquet-hadoop:jar:1.10.0:provided
[DEBUG] org.apache.parquet:parquet-format:jar:2.4.0:provided
[DEBUG] org.apache.parquet:parquet-jackson:jar:1.10.0:provided
[DEBUG] org.apache.arrow:arrow-vector:jar:0.10.0:provided
[DEBUG] org.apache.arrow:arrow-format:jar:0.10.0:provided
[DEBUG] org.apache.arrow:arrow-memory:jar:0.10.0:provided
[DEBUG] joda-time:joda-time:jar:2.9.9:provided
[DEBUG] com.carrotsearch:hppc:jar:0.7.2:provided
[DEBUG] com.vlkan:flatbuffers:jar:1.2.0-3f79e055:provided
[DEBUG] org.apache.spark:spark-core_2.11:jar:tests:2.4.0:test
[DEBUG] org.apache.spark:spark-streaming_2.11:jar:2.4.0:provided
[DEBUG] org.slf4j:slf4j-api:jar:1.7.7:provided
[DEBUG] log4j:log4j:jar:1.2.17:provided
[DEBUG] io.dropwizard.metrics:metrics-core:jar:3.1.5:provided
[DEBUG] org.eclipse.jetty:jetty-server:jar:9.3.20.v20170531:provided
[DEBUG] org.eclipse.jetty:jetty-http:jar:9.3.20.v20170531:provided
[DEBUG] org.eclipse.jetty:jetty-util:jar:9.3.20.v20170531:provided
[DEBUG] org.eclipse.jetty:jetty-io:jar:9.3.20.v20170531:provided
[DEBUG] com.github.dwickern:scala-nameof_2.11:jar:1.0.3:provided
[DEBUG] org.scala-lang:scala-reflect:jar:2.11.8:provided
[DEBUG] junit:junit:jar:4.12:test
[DEBUG] org.scalatest:scalatest_2.11:jar:3.0.3:test
[DEBUG] org.scalactic:scalactic_2.11:jar:3.0.3:test
[DEBUG] org.scala-lang.modules:scala-xml_2.11:jar:1.0.5:provided
[DEBUG] org.scala-lang.modules:scala-parser-combinators_2.11:jar:1.0.4:provided
[DEBUG] checking [com.microsoft.pnp:spark-listeners:jar:1.0-SNAPSHOT] for scala version
[DEBUG] checking [org.scala-lang:scala-library:jar:2.11.8:provided] for scala version
[DEBUG] C:\Users\KOF4BE\Downloads\spark-monitoring-master\src\spark-listeners\src\main\java
[DEBUG] C:\Users\KOF4BE\Downloads\spark-monitoring-master\src\spark-listeners\src\main\scala
[DEBUG] includes = [**/*.java,**/*.scala,]
[DEBUG] excludes = []
[INFO] Using incremental compilation
[DEBUG] Setup = {
[DEBUG] scala compiler = C:\Users\KOF4BE\.m2\repository\org\scala-lang\scala-compiler\2.11.8\scala-compiler-2.11.8.jar
[DEBUG] scala library = C:\Users\KOF4BE\.m2\repository\org\scala-lang\scala-library\2.11.8\scala-library-2.11.8.jar
[DEBUG] scala extra = {
[DEBUG] C:\Users\KOF4BE\.m2\repository\org\scala-lang\scala-library\2.11.6\scala-library-2.11.6.jar
[DEBUG] C:\Users\KOF4BE\.m2\repository\org\scala-lang\modules\scala-parser-combinators_2.11\1.0.4\scala-parser-combinators_2.11-1.0.4.jar
[DEBUG] C:\Users\KOF4BE\.m2\repository\org\scala-lang\scala-library\2.11.4\scala-library-2.11.4.jar
[DEBUG] C:\Users\KOF4BE\.m2\repository\org\scala-lang\modules\scala-xml_2.11\1.0.4\scala-xml_2.11-1.0.4.jar
[DEBUG] C:\Users\KOF4BE\.m2\repository\org\scala-lang\scala-reflect\2.11.8\scala-reflect-2.11.8.jar
[DEBUG] }
[DEBUG] sbt interface = C:\Users\KOF4BE\.m2\repository\com\typesafe\sbt\sbt-interface\0.13.15\sbt-interface-0.13.15.jar
[DEBUG] compiler interface sources = C:\Users\KOF4BE\.m2\repository\com\typesafe\sbt\compiler-interface\0.13.15\compiler-interface-0.13.15-sources.jar
[DEBUG] java home =
[DEBUG] fork java = false
[DEBUG] cache directory = C:\Users\KOF4BE\.zinc\0.3.15
[DEBUG] }
[INFO] 'compiler-interface' not yet compiled for Scala 2.11.8. Compiling...
[DEBUG] Plain interface to Scala compiler 2.11.8 with arguments:
-nowarn
-d
C:\Users\KOF4BE\AppData\Local\Temp\sbt_9e83bfb7
-bootclasspath
C:\Users\KOF4BE\.m2\repository\org\scala-lang\scala-library\2.11.8\scala-library-2.11.8.jar
-classpath
C:\Users\KOF4BE\.m2\repository\com\typesafe\sbt\sbt-interface\0.13.15\sbt-interface-0.13.15.jar;C:\Users\KOF4BE\.m2\repository\com\typesafe\sbt\compiler-interface\0.13.15\compiler-interface-0.13.15-sources.jar;C:\Users\KOF4BE\.m2\repository\org\scala-lang\scala-compiler\2.11.8\scala-compiler-2.11.8.jar;C:\Users\KOF4BE\.m2\repository\org\scala-lang\scala-library\2.11.6\scala-library-2.11.6.jar;C:\Users\KOF4BE\.m2\repository\org\scala-lang\modules\scala-parser-combinators_2.11\1.0.4\scala-parser-combinators_2.11-1.0.4.jar;C:\Users\KOF4BE\.m2\repository\org\scala-lang\scala-library\2.11.4\scala-library-2.11.4.jar;C:\Users\KOF4BE\.m2\repository\org\scala-lang\modules\scala-xml_2.11\1.0.4\scala-xml_2.11-1.0.4.jar;C:\Users\KOF4BE\.m2\repository\org\scala-lang\scala-reflect\2.11.8\scala-reflect-2.11.8.jar
C:\Users\KOF4BE\AppData\Local\Temp\sbt_3bbc25d0\xsbt\API.scala
C:\Users\KOF4BE\AppData\Local\Temp\sbt_3bbc25d0\xsbt\Analyzer.scala
C:\Users\KOF4BE\AppData\Local\Temp\sbt_3bbc25d0\xsbt\Command.scala
C:\Users\KOF4BE\AppData\Local\Temp\sbt_3bbc25d0\xsbt\Compat.scala
C:\Users\KOF4BE\AppData\Local\Temp\sbt_3bbc25d0\xsbt\CompilerInterface.scala
C:\Users\KOF4BE\AppData\Local\Temp\sbt_3bbc25d0\xsbt\ConsoleInterface.scala
C:\Users\KOF4BE\AppData\Local\Temp\sbt_3bbc25d0\xsbt\DelegatingReporter.scala
C:\Users\KOF4BE\AppData\Local\Temp\sbt_3bbc25d0\xsbt\Dependency.scala
C:\Users\KOF4BE\AppData\Local\Temp\sbt_3bbc25d0\xsbt\ExtractAPI.scala
C:\Users\KOF4BE\AppData\Local\Temp\sbt_3bbc25d0\xsbt\ExtractUsedNames.scala
C:\Users\KOF4BE\AppData\Local\Temp\sbt_3bbc25d0\xsbt\GlobalHelpers.scala
C:\Users\KOF4BE\AppData\Local\Temp\sbt_3bbc25d0\xsbt\LocateClassFile.scala
C:\Users\KOF4BE\AppData\Local\Temp\sbt_3bbc25d0\xsbt\Log.scala
C:\Users\KOF4BE\AppData\Local\Temp\sbt_3bbc25d0\xsbt\Message.scala
C:\Users\KOF4BE\AppData\Local\Temp\sbt_3bbc25d0\xsbt\ScaladocInterface.scala
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary for spark-monitoring 1.0-SNAPSHOT:
[INFO]
[INFO] spark-monitoring ................................... SUCCESS [ 3.623 s]
[INFO] spark-listeners .................................... FAILURE [ 5.928 s]
[INFO] spark-jobs ......................................... SKIPPED
[INFO] spark-listeners-loganalytics ....................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 9.709 s
[INFO] Finished at: 2019-06-17T15:02:56+02:00
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal net.alchim31.maven:scala-maven-plugin:3.4.2:compile (default) on project spark-listeners: Execution default of goal net.alchim31.maven:scala-maven-plugin:3.4.2:compile failed.: CompileFailed -> [Help 1]
org.apache.maven.lifecycle.LifecycleExecutionException: Failed to execute goal net.alchim31.maven:scala-maven-plugin:3.4.2:compile (default) on project spark-listeners: Execution default of goal net.alchim31.maven:scala-maven-plugin:3.4.2:compile failed.
at org.apache.maven.lifecycle.internal.MojoExecutor.execute (MojoExecutor.java:215)
at org.apache.maven.lifecycle.internal.MojoExecutor.execute (MojoExecutor.java:156)
at org.apache.maven.lifecycle.internal.MojoExecutor.execute (MojoExecutor.java:148)
at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject (LifecycleModuleBuilder.java:117)
at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject (LifecycleModuleBuilder.java:81)
at org.apache.maven.lifecycle.internal.builder.singlethreaded.SingleThreadedBuilder.build (SingleThreadedBuilder.java:56)
at org.apache.maven.lifecycle.internal.LifecycleStarter.execute (LifecycleStarter.java:128)
at org.apache.maven.DefaultMaven.doExecute (DefaultMaven.java:305)
at org.apache.maven.DefaultMaven.doExecute (DefaultMaven.java:192)
at org.apache.maven.DefaultMaven.execute (DefaultMaven.java:105)
at org.apache.maven.cli.MavenCli.execute (MavenCli.java:956)
at org.apache.maven.cli.MavenCli.doMain (MavenCli.java:288)
at org.apache.maven.cli.MavenCli.main (MavenCli.java:192)
at jdk.internal.reflect.NativeMethodAccessorImpl.invoke0 (Native Method)
at jdk.internal.reflect.NativeMethodAccessorImpl.invoke (NativeMethodAccessorImpl.java:62)
at jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke (DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke (Method.java:567)
at org.codehaus.plexus.classworlds.launcher.Launcher.launchEnhanced (Launcher.java:282)
at org.codehaus.plexus.classworlds.launcher.Launcher.launch (Launcher.java:225)
at org.codehaus.plexus.classworlds.launcher.Launcher.mainWithExitCode (Launcher.java:406)
at org.codehaus.plexus.classworlds.launcher.Launcher.main (Launcher.java:347)
Caused by: org.apache.maven.plugin.PluginExecutionException: Execution default of goal net.alchim31.maven:scala-maven-plugin:3.4.2:compile failed.
at org.apache.maven.plugin.DefaultBuildPluginManager.executeMojo (DefaultBuildPluginManager.java:148)
at org.apache.maven.lifecycle.internal.MojoExecutor.execute (MojoExecutor.java:210)
at org.apache.maven.lifecycle.internal.MojoExecutor.execute (MojoExecutor.java:156)
at org.apache.maven.lifecycle.internal.MojoExecutor.execute (MojoExecutor.java:148)
at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject (LifecycleModuleBuilder.java:117)
at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject (LifecycleModuleBuilder.java:81)
at org.apache.maven.lifecycle.internal.builder.singlethreaded.SingleThreadedBuilder.build (SingleThreadedBuilder.java:56)
at org.apache.maven.lifecycle.internal.LifecycleStarter.execute (LifecycleStarter.java:128)
at org.apache.maven.DefaultMaven.doExecute (DefaultMaven.java:305)
at org.apache.maven.DefaultMaven.doExecute (DefaultMaven.java:192)
at org.apache.maven.DefaultMaven.execute (DefaultMaven.java:105)
at org.apache.maven.cli.MavenCli.execute (MavenCli.java:956)
at org.apache.maven.cli.MavenCli.doMain (MavenCli.java:288)
at org.apache.maven.cli.MavenCli.main (MavenCli.java:192)
at jdk.internal.reflect.NativeMethodAccessorImpl.invoke0 (Native Method)
at jdk.internal.reflect.NativeMethodAccessorImpl.invoke (NativeMethodAccessorImpl.java:62)
at jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke (DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke (Method.java:567)
at org.codehaus.plexus.classworlds.launcher.Launcher.launchEnhanced (Launcher.java:282)
at org.codehaus.plexus.classworlds.launcher.Launcher.launch (Launcher.java:225)
at org.codehaus.plexus.classworlds.launcher.Launcher.mainWithExitCode (Launcher.java:406)
at org.codehaus.plexus.classworlds.launcher.Launcher.main (Launcher.java:347)
Caused by: sbt.compiler.CompileFailed
at sbt.compiler.AnalyzingCompiler$$anonfun$compileSources$1$$anonfun$apply$2.apply (AnalyzingCompiler.scala:158)
at sbt.compiler.AnalyzingCompiler$$anonfun$compileSources$1$$anonfun$apply$2.apply (AnalyzingCompiler.scala:155)
at sbt.IO$.withTemporaryDirectory (IO.scala:358)
at sbt.compiler.AnalyzingCompiler$$anonfun$compileSources$1.apply (AnalyzingCompiler.scala:155)
at sbt.compiler.AnalyzingCompiler$$anonfun$compileSources$1.apply (AnalyzingCompiler.scala:152)
at sbt.IO$.withTemporaryDirectory (IO.scala:358)
at sbt.compiler.AnalyzingCompiler$.compileSources (AnalyzingCompiler.scala:152)
at sbt.compiler.IC$.compileInterfaceJar (IncrementalCompiler.scala:58)
at com.typesafe.zinc.Compiler$.compilerInterface (Compiler.scala:154)
at com.typesafe.zinc.Compiler$.create (Compiler.scala:55)
at com.typesafe.zinc.Compiler.create (Compiler.scala)
at sbt_inc.SbtIncrementalCompiler.<init> (SbtIncrementalCompiler.java:67)
at scala_maven.ScalaCompilerSupport.incrementalCompile (ScalaCompilerSupport.java:311)
at scala_maven.ScalaCompilerSupport.compile (ScalaCompilerSupport.java:136)
at scala_maven.ScalaCompilerSupport.doExecute (ScalaCompilerSupport.java:116)
at scala_maven.ScalaMojoSupport.execute (ScalaMojoSupport.java:574)
at org.apache.maven.plugin.DefaultBuildPluginManager.executeMojo (DefaultBuildPluginManager.java:137)
at org.apache.maven.lifecycle.internal.MojoExecutor.execute (MojoExecutor.java:210)
at org.apache.maven.lifecycle.internal.MojoExecutor.execute (MojoExecutor.java:156)
at org.apache.maven.lifecycle.internal.MojoExecutor.execute (MojoExecutor.java:148)
at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject (LifecycleModuleBuilder.java:117)
at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject (LifecycleModuleBuilder.java:81)
at org.apache.maven.lifecycle.internal.builder.singlethreaded.SingleThreadedBuilder.build (SingleThreadedBuilder.java:56)
at org.apache.maven.lifecycle.internal.LifecycleStarter.execute (LifecycleStarter.java:128)
at org.apache.maven.DefaultMaven.doExecute (DefaultMaven.java:305)
at org.apache.maven.DefaultMaven.doExecute (DefaultMaven.java:192)
at org.apache.maven.DefaultMaven.execute (DefaultMaven.java:105)
at org.apache.maven.cli.MavenCli.execute (MavenCli.java:956)
at org.apache.maven.cli.MavenCli.doMain (MavenCli.java:288)
at org.apache.maven.cli.MavenCli.main (MavenCli.java:192)
at jdk.internal.reflect.NativeMethodAccessorImpl.invoke0 (Native Method)
at jdk.internal.reflect.NativeMethodAccessorImpl.invoke (NativeMethodAccessorImpl.java:62)
at jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke (DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke (Method.java:567)
at org.codehaus.plexus.classworlds.launcher.Launcher.launchEnhanced (Launcher.java:282)
at org.codehaus.plexus.classworlds.launcher.Launcher.launch (Launcher.java:225)
at org.codehaus.plexus.classworlds.launcher.Launcher.mainWithExitCode (Launcher.java:406)
at org.codehaus.plexus.classworlds.launcher.Launcher.main (Launcher.java:347)
[ERROR]
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/PluginExecutionException
[ERROR]
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <args> -rf :spark-listeners
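Note: the properties dump earlier in this log shows the build running on JDK 12 (java.version=12.0.1) while targeting Java 8 and Scala 2.11.8, and the failure occurs exactly while Zinc compiles its compiler-interface for Scala 2.11.8, a compiler generation that predates JDK 9+ support. A likely remedy — an inference from this log, not a fix confirmed in the thread — is to run Maven under JDK 8, for example in cmd.exe (the JDK 8 path below is a placeholder; adjust it to your machine):

REM Hypothetical fix: point Maven at a JDK 8 installation before building.
set JAVA_HOME=C:\Program Files\Java\jdk1.8.0_221
set "PATH=%JAVA_HOME%\bin;%PATH%"
mvn clean install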

spark-monitoring Library is not working for the 6.1, 6.2, 6.3 Databricks Runtime Versions

When I used this library on the 6.0 Databricks runtime version it worked fine, but I want to use the latest version, 6.3, with this library. I am facing an error where the init script fails while the cluster is coming up.

Event Logs:
TERMINATING
2020-01-28 11:22:28 IST
Cluster terminated. Reason: Init Script Failure
INIT_SCRIPTS_FINISHED
2020-01-28 11:22:27 IST
Finished Init Scripts execution.
INIT_SCRIPTS_STARTED
2020-01-28 11:22:26 IST
Starting Init Scripts execution.

After the init scripts finished executing, the cluster failed.

The reason is:

Message Cluster terminated. Reason: Init Script Failure Init script dbfs:/databricks/spark-monitoring/spark-monitoring.sh failed: Script exit status is non-zero

The same script works fine on the 5.5 and 6.0 Databricks runtime versions.

Please resolve this issue soon.

Writing less streaming data

We are monitoring Structured Streaming using the Spark monitoring library. I am interested in progress events only. I set the log4j logging level to INFO ("log4j.logger.org.apache.spark.sql.execution.streaming.StreamExecution=INFO"), but this still writes a lot of data. Is it possible to restrict this to progress events only (i.e., QueryProgressEvent)?

Thanks.

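One way to reduce the volume — a minimal sketch, assuming the Log Analytics appender is registered under the name logAnalyticsAppender (that name is an assumption; match it to the appender name in your cluster's log4j.properties) — is to raise the root threshold so that only the StreamExecution logger stays at INFO:

# Hypothetical log4j.properties fragment; the appender name is an assumption.
log4j.rootCategory=WARN, logAnalyticsAppender
log4j.logger.org.apache.spark.sql.execution.streaming.StreamExecution=INFO

Anything logged at WARN or above elsewhere would still be forwarded; narrowing the output to QueryProgressEvent alone would need a custom log4j filter on the appender.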

Cannot copy JAR files when Cluster running with a custom Docker image

Hello,

I am working with Databricks and am looking to set up monitoring by collecting metrics into Azure Log Analytics, which is exactly what you are suggesting. After some issues I finally managed to set it up on my DEV Databricks workspace using a basic cluster and the default runtime, and it collects the metrics without issue. However, when trying a more complex cluster that uses a custom Docker image, the init script fails at the very beginning.

Here is my cluster: [screenshot]

The init script configuration: [screenshot]

After the cluster started and finally stopped in error, this is the error found in the cluster logs:

cp: cannot create regular file '/mnt/driver-daemon/jars': No such file or directory

It's happening in dbfs:/databricks/spark-monitoring/spark-monitoring.sh at line 63 when trying to copy the JAR files to the cluster:

cp -f "$STAGE_DIR/$JAR_FILENAME" /mnt/driver-daemon/jars

It seems this folder does not exist in the Docker image I'm using. Is there a way to bypass this? Is there a way to use another folder? If so, what other files should I update to ensure the JARs are loaded from the new folder?

Thank you for your help
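A possible workaround — untested, and assuming the custom image permits creating the directory at init time — is to create the target directory in the init script before the copy:

# Hypothetical pre-step for the init script: create the jars directory
# when the custom Docker image does not ship with it.
mkdir -p /mnt/driver-daemon/jars
cp -f "$STAGE_DIR/$JAR_FILENAME" /mnt/driver-daemon/jars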

I am not able to build

I tried to build the library today and it gives me "Service 'sparkDriver' could not bind to a random port."
I am attaching screenshots. [screenshots]
I was able to build successfully some time before Aug 16.
Can someone please look into this?

Thank you
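This error usually means the test JVMs forked during the build cannot find a bindable local address. A common workaround — a sketch, not a fix confirmed for this repository — is to pin Spark's bind address before invoking Maven (on Windows, use set instead of export):

# Hypothetical workaround: force Spark's test JVMs onto the loopback address.
export SPARK_LOCAL_IP=127.0.0.1
mvn clean install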

Docker-based build statement for sample project not working on Windows

Command:
docker run -it --rm -v %cd%/spark-monitoring/sample/spark-sample-job:/spark-monitoring -v "%USERPROFILE%/.m2":/root/.m2 maven:3.6.1-jdk-8 mvn clean install -P scala-2.11_spark-2.4.3

Directories the command was run from (tried):
..
..\spark-monitoring
..\spark-monitoring\sample\spark-sample-job

Output:
[INFO] Scanning for projects...
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 0.115 s
[INFO] Finished at: 2020-03-03T23:31:18Z
[INFO] ------------------------------------------------------------------------
[WARNING] The requested profile "scala-2.11_spark-2.4.3" could not be activated because it does not exist.
[ERROR] The goal you specified requires a project to execute but there is no POM in this directory (/). Please verify you invoked Maven from the correct directory. -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MissingProjectException

Can you please help me identify what should be changed? It seems that Maven is not able to find the POM file.
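A hedged guess based on the output: Maven reports there is no POM in / because the container's working directory is /, not the mounted folder (the profile warning is a consequence of the missing POM, since without one no profiles can be found). Pointing Maven at the POM explicitly may fix it; for example, running from the directory that contains spark-monitoring:

docker run -it --rm -v %cd%/spark-monitoring:/spark-monitoring -v "%USERPROFILE%/.m2":/root/.m2 maven:3.6.1-jdk-8 mvn -f /spark-monitoring/sample/spark-sample-job/pom.xml clean install -P scala-2.11_spark-2.4.3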

Build error while generating Jar file

I am not able to generate the JAR file for spark-listeners-loganalytics.
I am getting the two errors below. Can you please help me here?

Error:(6, 8) object SparkListenerSink is not a member of package org.apache.spark.listeners.sink
import org.apache.spark.listeners.sink.SparkListenerSink

Error:(11, 8) object SparkInformation is not a member of package org.apache.spark
import org.apache.spark.SparkInformation
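A likely cause (an assumption based on the error messages): SparkListenerSink and SparkInformation are provided by the sibling spark-listeners module, so that module has to be built and installed before spark-listeners-loganalytics. Building the whole project lets Maven handle the ordering:

# Run from the base directory of the repository; pick the profile
# for your runtime from "Supported configurations".
mvn -f src/pom.xml clean install -P <maven profile>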

Log4j appender produces StackOverflowErrors when debug logging is turned on

When any debug logging is turned on, the log4j appender produces a StackOverflowError.

This is caused by the appender and its dependent classes using log4j to log statements during the setup of the appender. This causes an endless loop, and hence the StackOverflowError.

I believe the correct way to log during log4j initialization is via the LogLog class, which outputs to System.out.
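A minimal sketch of that approach (assuming the appender's setup code can be modified; the messages are illustrative):

import org.apache.log4j.helpers.LogLog;

// Inside the appender's constructor/activateOptions(), avoid routing
// diagnostics through log4j itself; LogLog writes straight to the console,
// so no appender is re-entered and no loop can form.
LogLog.debug("LogAnalyticsAppender: initializing sender");
LogLog.warn("LogAnalyticsAppender: falling back to default configuration");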

Spark monitoring script compatibility issues with Databricks runtime versions beyond 6.3

Hi,

We added the spark-monitoring init script to our workspace for interactive and automated clusters running Databricks Runtime 6.3 (includes Apache Spark 2.4.4, Scala 2.11). We now have issues because this runtime version is out of support. When we changed the cluster's runtime version to something beyond 6.3 and added the init scripts, we saw that the cluster start-up time is longer than expected.

Seeking your help in identifying this issue.

Regards,

Rakesh.N

Data for some attributes seem to be cut in Log Analytics Workspace

I am trying to retrieve a report on the SQL queries that have been executed on a given cluster. However, when running queries against the Log Analytics workspace, I noticed that longer SQL queries are cut off and followed by "..." at the end. The attribute that holds this information is Properties_spark_job_description_s.
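To see whether the cut happens at a fixed length, one quick check (a KQL sketch; I am assuming the field is ingested into SparkListenerEvent_CL):

SparkListenerEvent_CL
| where isnotempty(Properties_spark_job_description_s)
| summarize max(strlen(Properties_spark_job_description_s))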

Can you please help me understand how this can be modified to report the full query?

Logging to Azure Log Analytics workspace through Scala Notebook

This did not work for me from a Scala notebook. I see cluster logs going to the Log Analytics workspace, but not the notebook logs. Steps followed for DBR Runtime 5.5 LTS:

1. Created a new cluster to run listeners.sh as the init script.
2. Configured logging in the notebook:

// Configure our logging.
// Log4jConfiguration is provided by the spark-listeners-loganalytics JAR;
// TryWith is the loan-pattern helper from the sample job.
import com.microsoft.pnp.logging.Log4jConfiguration

TryWith(getClass.getResourceAsStream("/log4j.properties")) {
  stream => {
    Log4jConfiguration.configure(stream)
  }
}
3. Logged events, which did not show up in the Log Analytics workspace:

import org.slf4j.LoggerFactory

val logger = LoggerFactory.getLogger(getClass)
logger.info("Info message")
logger.warn("Warn message")
logger.error("Error message")
Am I missing something here? Would appreciate any help.

Failure building the Maven project.

I am very new to this and am still learning about IT. I am trying to follow all the steps, but I am dealing with an error when building the JAR files and cannot find the answer. If someone could help me, I would be very glad.

I am executing the Maven package build phase in IntelliJ, but the build is failing. The error says "Failed to execute goal net.alchim31.maven:scala-maven-plugin:3.4.2:compile (default) on project spark-listeners: Execution default of goal net.alchim31.maven:scala-maven-plugin:3.4.2:compile failed.: CompileFailed".

I also get this error about "compiler in mirror not found": [screenshot]
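One way to take the local JDK/Scala toolchain out of the picture (a sketch adapted from the repository's Docker build; the Maven image tag and profile are assumptions to adjust for your runtime) is to build inside a JDK 8 Maven container:

# Run from the base directory of the repository.
docker run -it --rm -v "$PWD":/spark-monitoring -v "$HOME/.m2":/root/.m2 maven:3.6.1-jdk-8 mvn -f /spark-monitoring/src/pom.xml clean package -P <maven profile>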

Grafana dashboard panel is showing Request Error

I am using the Azure instance of Grafana and am trying to set up monitoring for Azure Databricks using Grafana dashboards. I followed this link to set up Grafana and its dashboard: https://docs.microsoft.com/bs-latn-ba/azure/architecture/databricks-monitoring/dashboards. But when I completed the setup, the panels in the Grafana dashboard show "Request Error" on top. When I looked into a panel's query, it shows "TypeError: undefined is not iterable (cannot read property Symbol(Symbol.iterator))" as the error.
The syntax error I could find in the browser console is:
'extend' operator: Failed to resolve scalar expression named 'Properties_spark_metric_namespace_s'
[screenshot]
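A basic first check (a KQL sketch; an assumption on my part is that the dashboard panels read from the SparkMetric_CL table, so the resolve error would mean no records with that column have been ingested yet):

SparkMetric_CL
| where TimeGenerated > ago(1h)
| take 10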

Feature Request: Azure DevOps Yaml Pipeline

The instructions in the readme are helpful for building the monitoring library and starting to use it. It would be great if there were an Azure DevOps YAML pipeline definition file to help those who want to use the library in Azure Databricks but may not be very familiar with building Scala on their local machine.

If this would be a useful contribution, I would be willing to create the YAML file and update the readme with instructions.
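As a starting point, a rough sketch of what I have in mind (untested; the trigger branch, pool image, and Maven profile are placeholders):

# azure-pipelines.yml (sketch)
trigger:
  - main

pool:
  vmImage: ubuntu-latest

steps:
  - task: Maven@3
    displayName: Build spark-monitoring JARs
    inputs:
      mavenPomFile: src/pom.xml     # main build file for the whole project
      goals: package
      options: -P <maven profile>   # pick the profile for the target runtime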

Additional LogAnalyticsAppender not working for automated clusters

I have an issue getting spark-monitoring to work in automated clusters running a Python notebook job. The notebook has both Python code and some Scala code running in separate %scala commands. Both the Python and the Scala code need to log events that should be forwarded to Log Analytics.

Using the default cluster init script from this repo, and configuring it like in the README.md works as expected.

But I have tweaked the log4j config in the cluster init script spark-monitoring.sh like in this code snippet

...
tee -a ${LOG4J_CONFIG_FILE} << EOF
# logAnalytics
log4j.appender.logAnalyticsAppender=com.microsoft.pnp.logging.loganalytics.LogAnalyticsAppender
log4j.appender.logAnalyticsAppender.filter.spark=com.microsoft.pnp.logging.SparkPropertyEnricher
log4j.appender.logAnalyticsAppender.Threshold=WARN

# pipelineLog
log4j.logger.pipelineLog=INFO, pipelineAppender
log4j.additivity.pipelineLog=false

# pipelineAppender
log4j.appender.pipelineAppender=com.microsoft.pnp.logging.loganalytics.LogAnalyticsAppender
log4j.appender.pipelineAppender.filter.spark=com.microsoft.pnp.logging.SparkPropertyEnricher
log4j.appender.pipelineAppender.layout=com.microsoft.pnp.logging.JSONLayout
log4j.appender.pipelineAppender.layout.LocationInfo=false
log4j.appender.pipelineAppender.logType=PipelineLoggingEvent
log4j.appender.pipelineAppender.Threshold=INFO
EOF

echo "END: Updating $LOG4J_CONFIG_FILE with Log Analytics appender"
...

Basically, I want to forward only log events of level WARN to Log Analytics by setting a Threshold on the default logAnalyticsAppender used by the rootCategory logger. Then I want a separate logger called pipelineLog that forwards log events of level INFO to a new dedicated table called PipelineLoggingEvent in Log Analytics.

The log4j config shown in the above snippet works fine when running the notebook on an interactive cluster, but when scheduling it with jobs and automated clusters it does not work. Nothing gets sent when using the pipelineLog logger from either Python or Scala code.
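For reference, this is roughly how I write to the logger from the notebook (the logger name matches the config above; the Python variant goes through PySpark's JVM gateway):

// %scala
val pipelineLogger = org.apache.log4j.LogManager.getLogger("pipelineLog")
pipelineLogger.info("event from scala")

# %python
pipeline_logger = sc._jvm.org.apache.log4j.LogManager.getLogger("pipelineLog")
pipeline_logger.info("event from python")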

Both the interactive cluster and the automated clusters are using the revised cluster init script and the Databricks 5.5 LTS runtime.

Hope to get some insight into why interactive clusters and automated clusters behave differently with this log4j config.
