
salesforce / argus


Time series monitoring and alerting platform.

License: BSD 3-Clause "New" or "Revised" License

Shell 0.05% Java 89.72% JavaScript 8.17% CSS 0.31% HTML 1.73% Dockerfile 0.01%
argus timeseries-database alerting dashboards metrics high-throughput low-latency

argus's Introduction

Argus

Argus is a time-series monitoring and alerting platform. It consists of discrete services to configure alerts, ingest and transform metrics & events, send notifications, create namespaces, and to both establish and enforce policies and quotas for usage.

Its architecture allows any and all of these services to be retargeted to new technology as it becomes available, with little to no impact on the users.

To find out more see the wiki and check out the release notes.

Argus UI

Building Argus

Installing The Resource Filters

Argus uses the argus-build.properties file as a resource filter for the build and all the module builds. After you clone the project for the first time, or after you change this file, you must create and install the dependency jars which will contain these filters. Those dependency jars are then pulled in by the modules, expanded and have their values applied to the module specific builds. Luckily, it's a straightforward operation. Just execute the following command from within the parent project, after you first clone the project or after you update the argus-build.properties file.

mvn -DskipTests=true -DskipDockerBuild --non-recursive install

Running The Unit Tests

Once the resource filters are installed, you can run unit tests. Running the unit tests doesn't require any changes to the argus-build.properties file. Just install the resource filters and execute the test goal.

mvn test

Only the unit tests are run by codecov.io and as such, the coverage reported by it is significantly less than the coverage obtained by running the full test suite.

Running The Integration Tests (Deprecated)

Currently the integration tests are non-functional but are being retained in case we want to resurrect them.

The integration tests for Argus use the LDAPAuthService implementation of the AuthService interface and the DefaultTSDBService implementation of the TSDBService interface (which targets OpenTSDB). Additionally, they use the RedisCacheService implementation of the CacheService interface to facilitate integration testing of the BatchService. In order to run the integration tests, you must update the argus-build.properties file to correctly set up the external LDAP you'll be testing against, the OpenTSDB endpoints to use, and the Redis cluster. The snippet below shows the specific properties that should be modified in the argus-build.properties file. Of course, after you make these updates, you must re-install the resource filter dependencies as described above and execute the clean goal before running the integration tests.

# The LDAP endpoint to use
service.property.auth.ldap.endpoint=ldaps://ldaps.yourdomain.com:636
# A list of comma separated search paths used to query the DN of users attempting to authenticate.
# This example lists two separate search bases.  One for users and one for service accounts.
service.property.auth.ldap.searchbase=OU=active,OU=user,DC=yourdomain,DC=com:OU=active,OU=robot,DC=yourdomain,DC=com
# This specifies the DN for the privileged user that is used to bind and subsequently execute the search for user DNs.
service.property.auth.ldap.searchdn=CN=argus_admin,OU=active,OU=user,DC=yourdomain,DC=com
# The password for the privileged user above.
service.property.auth.ldap.searchpwd=Argu5R0cks!
# The LDAP field against which the username provided during a login attempt will be matched.
# This is used so Argus can obtain the DN for the user attempting to login, and subsequently attempt to bind as that user.
service.property.auth.ldap.usernamefield=sAMAccountName
# The TSDB read endpoint
service.property.tsdb.endpoint.read=http://readtsdb.yourdomain.com:4466
# The TSDB write endpoint
service.property.tsdb.endpoint.write=http://writetsdb.yourdomain.com:4477
# The Redis cache cluster information
service.property.cache.redis.cluster=redis0.mycompany.com:6379,redis1.mycompany.com:6389

Once the modifications have been made and the resource filters re-installed, you're ready to run the complete suite of tests, including the integration tests.

mvn verify

Generating Coverage Reports

Coverage is calculated every time tests are run for all modules with the exception of ArgusWeb. In order to generate a coverage report for a module, just cd into the module subdirectory and run the report generation target.

mvn jacoco:report

Coverage reports are generated in the target/site/jacoco directory.

Deploying & Running Argus

Please see the wiki for information on how to deploy, configure and run Argus.

argus's People

Contributors

anish, axdotl, bsura, cannakula-sfdc, cnardi-dev, colbyguan, dilip-devaraj, dilipdevaraj-sfdc, dongpujin, gaurav-kumar-sfdc, gauravk48, greatruofan, hivehand, jmr50, justinharringa, kow1011b, krasv, melan, naveenreddykarri, nkunal, pfu-salesforce, pratiksha-shah, raj-sarkapally, rmelick, ryanguest, studanshu, sundeepsf, taozhangsfdc, xizi-xu, yjanezou


argus's Issues

Lots of errors in logs when no alerts are in DB

I see a lot of errors in my alert client when I haven't defined any alerts.

argus-alert-client          | [ ARGUS | *NULLSESSION* | *NULLUSER* | *NULLTXID* | 2016-12-08 15:06:46.359 | alertclient-0 |  WARN ] Exception in alerter: java.lang.IllegalArgumentException: IDs list cannot be null or empty.

parent pom is not published to maven central

The argus parent pom is not published to maven central (http://search.maven.org/#search%7Cga%7C1%7Cargus). This makes it impossible for maven to resolve dependencies on the subprojects correctly.

I created a test project (https://github.com/rmelick/argusclient) that shows the following failure if you build with mvn package -s settings.xml -U

[ERROR] Failed to execute goal on project argustest: Could not resolve dependencies for project net.rmelick.maven:argustest:jar:1.0-SNAPSHOT: Failed to collect dependencies at com.salesforce.argus:argus-client:jar:2.1.1: Failed to read artifact descriptor for com.salesforce.argus:argus-client:jar:2.1.1: Could not find artifact com.salesforce.argus:argus:pom:2.1.1 in central (https://repo.maven.apache.org/maven2) -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/DependencyResolutionException

java.net.BindException: Address already in use

When I use the mvn test command, I don't know which port is already in use:

[ ARGUS | *NULLSESSION* | *NULLUSER* | *NULLTXID* | 2017-02-15 11:40:49.501 | Thread-13 | ERROR ] From testing server (random state: false) for instance: InstanceSpec{dataDirectory=/var/folders/c5/cf47njj11kv8_6z7h4z22rg80000gn/T/1487130049499-0, port=2185, electionPort=50039, quorumPort=50040, deleteDataDirectoryOnClose=true, serverId=9, tickTime=-1, maxClientCnxns=-1} org.apache.curator.test.InstanceSpec@889
java.net.BindException: Address already in use

thanks

Tests fail with German default locale

testHoltWintersDeviation(com.salesforce.dva.argus.service.metric.transform.HoltWintersTransformTest)  Time elapsed: 0.031 sec  <<< FAILURE!
java.lang.AssertionError
	at com.salesforce.dva.argus.service.metric.transform.HoltWintersTransformTest.testHoltWintersDeviation(HoltWintersTransformTest.java:115)

testHoltWintersForecast(com.salesforce.dva.argus.service.metric.transform.HoltWintersTransformTest)  Time elapsed: 0 sec  <<< FAILURE!
java.lang.AssertionError
	at com.salesforce.dva.argus.service.metric.transform.HoltWintersTransformTest.testHoltWintersForecast(HoltWintersTransformTest.java:80)

After improving the assert statement, the error is more useful:

java.lang.AssertionError: 
Expected :{2=0, 3=0.1, 4=0.27997, 5=0.72374, 6=1.57824, 7=1.56059, 8=1.46463, 9=1.45847, 10=1.47683}
Actual   :{2=0, 3=0,1, 4=0,27997, 5=0,72374, 6=1,57824, 7=1,56059, 8=1,46463, 9=1,45847, 10=1,47683}
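The comma-separated "Actual" values point at number formatting done with the JVM's default locale, where German uses a decimal comma. A minimal illustration (class and method names are mine, not from the Argus test suite):

```java
import java.util.Locale;

// Demonstrates how the same double renders differently depending on the
// locale passed to String.format. Tests that compare formatted numbers
// against US-style literals fail under a German default locale.
public class LocaleFormatDemo {
    public static String format(double value, Locale locale) {
        return String.format(locale, "%.5f", value);
    }
}
```

Pinning the locale explicitly (or comparing doubles instead of strings) would make the assertions locale-independent.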

Argus Web CORS issue

Argus login/authentication does not succeed because the login request from the web app to the servlet running in the Tomcat container is a cross-origin HTTP request.

I have deployed the war file on tomcat-7.0.69.


Discovery service returning duplicate results


For example, there are two metrics:
core.WAS.SP1.a
core.WAS.SP1.b

When a user types core.was.*, the result shows two records as below:
core.WAS.SP1
core.WAS.SP1
rather than just one.

As a result, the discovery window easily fills up and becomes less useful.

Discovery Service fetch limit estimation

Discovery Service fetch limit estimation: it is currently based on the assumption of minute resolution, multiplied by the time range. This causes many queries to be overestimated and therefore fail to fetch.

To fix this, I propose first taking a sample to determine the average resolution, then using that to decide the data volume that will be fetched.

Let me know if you need more details.

@saaja-sfdc @kgowdru

Preferences for wiki updates?

Once the docker images are available (#302), I'll need to update the wiki to describe how to properly use them. I was also thinking of enhancing a few other parts that confused me when I tried to get started deploying Argus. Before I start, I have the following questions:

  • How do wiki updates work? Should I clone the wiki (https://github.com/salesforce/Argus.wiki.git) and create a new branch with my changes?
  • I would like to add a few architecture diagrams to the wiki. Is there a preferred diagram editor (hopefully also accessible on linux?). I was thinking of https://www.draw.io/, Google Docs, or OpenOffice Draw.
  • Is there general guidance about what goes in the README vs. what goes in the wiki?

Unit tests are failing for WARDEN service

Hi,
I recently installed and compiled Argus successfully. However, the test is failing in "ArgusCore" subsystem for Warden service. Has anyone seen this?
Attached my config file and error messages.

transform proposed: deduct

hi @saaja-sfdc @rajsarkapally-sfdc
I recently found myself needing this transform. Please let me know if it is already implemented; otherwise I will write one and submit a PR.

DEDUCT(M1, M2) returns M1 - M2 along the time dimension. It removes all items from M1 whose keys are found in M2. This is the opposite of UNION.

Note: this is not MIN. MIN only aligns both metrics by key and subtracts the values.

Usage: DEDUCT(List).
Example:
input:
M1 is 1L:10, 2L:10, 3L:20
M2 is 1L:10

output as DEDUCT(M1,M2)
M is 2L:10, 3L:20
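The proposed semantics can be sketched in a few lines; this is an illustrative implementation with hypothetical names, not Argus transform code:

```java
import java.util.Map;
import java.util.TreeMap;

// Sketch of the proposed DEDUCT transform: keep only the datapoints of M1
// whose timestamps do not occur in M2.
public class DeductSketch {
    public static Map<Long, Double> deduct(Map<Long, Double> m1, Map<Long, Double> m2) {
        Map<Long, Double> result = new TreeMap<>(m1);
        // drop every timestamp that also appears in m2
        result.keySet().removeAll(m2.keySet());
        return result;
    }
}
```

With M1 = {1L:10, 2L:10, 3L:20} and M2 = {1L:10}, this yields {2L:10, 3L:20} as in the example above.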

No way to change the case of a parameter place holder

I have a dashboard with an <ag-text> parameter for datacenter.

<ag-text type="text" name="datacenter" label="Datacenter" default="STAGE"></ag-text>

Some of my metrics have the datacenter variable in UPPERCASE, and some are in lowercase. Argus needs a way to adjust the case of text parameters used in binding expressions.

My hacky fix is to have a second <ag-text> parameter for datacenter-lower-case, but this is very suboptimal.

unit test failed

When I try to run the unit tests, the error below occurs. I put persistence.xml under the test resources dir, and argus.properties has the config items for JPA, but every time I see this error. Can you help me solve it?

java.lang.IllegalArgumentException: Unknown Entity bean class: class com.salesforce.dva.argus.entity.PrincipalUser, please verify that this class has been marked with the @entity annotation.
at org.eclipse.persistence.internal.jpa.EntityManagerImpl.find(EntityManagerImpl.java:718)
at org.eclipse.persistence.internal.jpa.EntityManagerImpl.find(EntityManagerImpl.java:599)

Insufficient error message when pushing metrics to OpenTSDB fails

Scenario: trying to put ~20k metrics in a single HTTP request to /argus/collection/metrics fails with a misleading error message.

When pushing 20k metrics via /argus/collection/metrics, the request fails with a 500. The following insufficient log message occurs in the metric-client log:

[ ARGUS | *NULLSESSION* | *NULLUSER* | *NULLTXID* | 2017-05-05 14:09:23.658 | ccommitclient-1 |  INFO ] Error occured while committing metrics. Reason com.salesforce.dva.argus.system.SystemException: com.fasterxml.jackson.core.JsonParseException: Unexpected character ('<' (code 60)): expected a valid value (number, String, array, object, 'true', 'false' or 'null')
 at [Source: java.io.StringReader@385564dd; line: 1, column: 2]

The root cause for this error is that chunked requests to OpenTSDB are not allowed by default, so the REST call to OpenTSDB's /api/put returns 400 and some HTML content (which contains the actual error message 'Chunked request not supported.').
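Assuming that root cause, one likely remedy is to enable chunked request support on the OpenTSDB side in opentsdb.conf (the size value below is only an example and should be tuned to your payloads):

```
# Allow HTTP requests to arrive in multiple chunks
tsd.http.request.enable_chunked = true
# Maximum request body size to accept, in bytes
tsd.http.request.max_chunk = 16000
```

Alternatively, the client could batch metrics into smaller requests that fit in a single chunk.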

Refactor PhoenixTSDBEngine to remove a few RESOURCE_LEAKS reported by Coverity

** CID 163940: FindBugs: Performance (FB.SBSC_USE_STRINGBUFFER_CONCATENATION)
/ArgusCore/src/main/java/com/salesforce/dva/argus/service/tsdb/PhoenixTSDBEngine.java: 71 in com.salesforce.dva.argus.service.tsdb.PhoenixTSDBEngine.upsertMetrics(java.sql.Connection, com.salesforce.dva.argus.entity.Metric)()


*** CID 163940: FindBugs: Performance (FB.SBSC_USE_STRINGBUFFER_CONCATENATION)
/ArgusCore/src/main/java/com/salesforce/dva/argus/service/tsdb/PhoenixTSDBEngine.java: 71 in com.salesforce.dva.argus.service.tsdb.PhoenixTSDBEngine.upsertMetrics(java.sql.Connection, com.salesforce.dva.argus.entity.Metric)()
65
66 String viewName = getPhoenixViewName(metric.getScope(), metric.getMetric());
67
68 String tagkeys = "", tagvalues = "";
69 for(Map.Entry<String, String> tagEntry : metric.getTags().entrySet()) {
70 tagkeys += "\"" + tagEntry.getKey() + "\",";

CID 163940:  FindBugs: Performance  (FB.SBSC_USE_STRINGBUFFER_CONCATENATION)
com.salesforce.dva.argus.service.tsdb.PhoenixTSDBEngine.upsertMetrics(Connection, Metric) concatenates strings using + in a loop.

71 tagvalues += "'" + tagEntry.getValue() + "',";
72 }
73
74 if(metric.getDisplayName() != null && !metric.getDisplayName().isEmpty()) {
75 tagkeys += "DISPLAY_NAME,";
76 tagvalues += "'" + metric.getDisplayName() + "',";

** CID 163939: FindBugs: Performance (FB.SBSC_USE_STRINGBUFFER_CONCATENATION)
/ArgusCore/src/main/java/com/salesforce/dva/argus/service/tsdb/PhoenixTSDBEngine.java: 203 in com.salesforce.dva.argus.service.tsdb.PhoenixTSDBEngine.getPhoenixQuery(com.salesforce.dva.argus.service.tsdb.MetricQuery)()


*** CID 163939: FindBugs: Performance (FB.SBSC_USE_STRINGBUFFER_CONCATENATION)
/ArgusCore/src/main/java/com/salesforce/dva/argus/service/tsdb/PhoenixTSDBEngine.java: 203 in com.salesforce.dva.argus.service.tsdb.PhoenixTSDBEngine.getPhoenixQuery(com.salesforce.dva.argus.service.tsdb.MetricQuery)()
197 List tagList = Arrays.asList(tagValue.split("\\|"));
198 String tagValues = tagList.stream()
199 .map((s) -> "'" + s + "'")
200 .collect(Collectors.joining(", "));
201 tagWhereClaue += " AND \"" + tagEntry.getKey() + "\" IN (" + tagValues + ")";
202 } else {

CID 163939:  FindBugs: Performance  (FB.SBSC_USE_STRINGBUFFER_CONCATENATION)
com.salesforce.dva.argus.service.tsdb.PhoenixTSDBEngine.getPhoenixQuery(MetricQuery) concatenates strings using + in a loop.

203 tagWhereClaue += " AND \"" + tagEntry.getKey() + "\" IN ('" + tagValue + "')";
204 }
205 }
206
207 String selectSql = MessageFormat.format("SELECT {0}(val) val, ts epoch_time, display_name, units {1} FROM {2}"
208 + " WHERE ts <= ? AND ts >= ? {3}"

** CID 163932: Resource leaks (RESOURCE_LEAK)
/ArgusCore/src/main/java/com/salesforce/dva/argus/service/tsdb/PhoenixTSDBEngine.java: 161 in com.salesforce.dva.argus.service.tsdb.PhoenixTSDBEngine.selectMetrics(java.sql.Connection, com.salesforce.dva.argus.service.tsdb.MetricQuery)()


*** CID 163932: Resource leaks (RESOURCE_LEAK)
/ArgusCore/src/main/java/com/salesforce/dva/argus/service/tsdb/PhoenixTSDBEngine.java: 161 in com.salesforce.dva.argus.service.tsdb.PhoenixTSDBEngine.selectMetrics(java.sql.Connection, com.salesforce.dva.argus.service.tsdb.MetricQuery)()
155 metric.setDatapoints(datapoints);
156 metric.setDisplayName(displayName);
157 metric.setUnits(units);
158 metrics.put(identifier, metric);
159 }
160 }

CID 163932:  Resource leaks  (RESOURCE_LEAK)
Variable "preparedStmt" going out of scope leaks the resource it refers to.

161 } catch(SQLException sqle) {
162 _logger.warn("Failed to read data from Phoenix.", sqle);
163 }
164
165 return new ArrayList<>(metrics.values());
166 }

** CID 163929: (RESOURCE_LEAK)
/ArgusCore/src/main/java/com/salesforce/dva/argus/service/tsdb/PhoenixTSDBService.java: 61 in com.salesforce.dva.argus.service.tsdb.PhoenixTSDBService.(com.salesforce.dva.argus.system.SystemConfiguration, com.salesforce.dva.argus.service.MonitorService)()
/ArgusCore/src/main/java/com/salesforce/dva/argus/service/tsdb/PhoenixTSDBService.java: 67 in com.salesforce.dva.argus.service.tsdb.PhoenixTSDBService.(com.salesforce.dva.argus.system.SystemConfiguration, com.salesforce.dva.argus.service.MonitorService)()


*** CID 163929: (RESOURCE_LEAK)
/ArgusCore/src/main/java/com/salesforce/dva/argus/service/tsdb/PhoenixTSDBService.java: 61 in com.salesforce.dva.argus.service.tsdb.PhoenixTSDBService.(com.salesforce.dva.argus.system.SystemConfiguration, com.salesforce.dva.argus.service.MonitorService)()
55 _connection = DriverManager.getConnection(_phoenixJDBCUrl, props);
56 } catch (SQLException e) {
57 throw new SystemException("Failed to create connection to phoenix using jdbc url: " + _phoenixJDBCUrl, e);
58 }
59
60 try {

CID 163929:    (RESOURCE_LEAK)
Failing to save or close resource created by "_connection.createStatement()" leaks it.

61 _connection.createStatement().execute("CREATE SEQUENCE IF NOT EXISTS METRIC_ID_SEQ");
62 } catch (SQLException e) {
63 throw new SystemException("Failed to create sequence : " + _phoenixJDBCUrl, e);
64 }
65
66 try {
/ArgusCore/src/main/java/com/salesforce/dva/argus/service/tsdb/PhoenixTSDBService.java: 67 in com.salesforce.dva.argus.service.tsdb.PhoenixTSDBService.(com.salesforce.dva.argus.system.SystemConfiguration, com.salesforce.dva.argus.service.MonitorService)()
61 _connection.createStatement().execute("CREATE SEQUENCE IF NOT EXISTS METRIC_ID_SEQ");
62 } catch (SQLException e) {
63 throw new SystemException("Failed to create sequence : " + _phoenixJDBCUrl, e);
64 }
65
66 try {

CID 163929:    (RESOURCE_LEAK)
Failing to save or close resource created by "_connection.createStatement()" leaks it.

67 _connection.createStatement().execute("CREATE TABLE ARGUS.METRICS (id INTEGER NOT NULL, ts DATE NOT NULL, val DOUBLE, display_name varchar, units varchar CONSTRAINT PK PRIMARY KEY(id,ts)) APPEND_ONLY_SCHEMA = true, UPDATE_CACHE_FREQUENCY = 900000, AUTO_PARTITION_SEQ=METRIC_ID_SEQ");
68 // TODO change the create table ddl to IF NOT EXISTS PHOENIX-3660 is fixed
69 } catch (TableAlreadyExistsException e) {
70 System.out.println();
71 } catch (SQLException e) {
72 throw new SystemException("Failed to create base table: " + _phoenixJDBCUrl, e);
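The two categories of findings suggest two mechanical fixes: accumulate strings with StringBuilder instead of += in a loop, and close JDBC resources with try-with-resources. A sketch with hypothetical names, not the actual PhoenixTSDBEngine code:

```java
import java.sql.Connection;
import java.sql.SQLException;
import java.sql.Statement;
import java.util.Map;

// Illustrates the shape of the fixes Coverity is asking for.
public class PhoenixFixSketch {
    // FB.SBSC_USE_STRINGBUFFER_CONCATENATION: build the quoted key list
    // with a StringBuilder rather than String += inside the loop.
    static String joinTagKeys(Map<String, String> tags) {
        StringBuilder keys = new StringBuilder();
        for (Map.Entry<String, String> e : tags.entrySet()) {
            keys.append('"').append(e.getKey()).append("\",");
        }
        return keys.toString();
    }

    // RESOURCE_LEAK: try-with-resources closes the Statement even when
    // execute() throws, unlike connection.createStatement().execute(...).
    static void createSequence(Connection connection) throws SQLException {
        try (Statement stmt = connection.createStatement()) {
            stmt.execute("CREATE SEQUENCE IF NOT EXISTS METRIC_ID_SEQ");
        }
    }
}
```

The same try-with-resources pattern would cover the leaked PreparedStatement in selectMetrics.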

Runtime configuration of database connection

I would like to be able to configure the database connection (at a minimum the user name and password) at runtime. This would allow me to deploy my same built war to multiple environments (for example test and production).

It seems there are a few comments out there about how to do it for Hibernate or through direct property setting in Java (http://stackoverflow.com/questions/8324821/properties-reference-for-hibernate-in-persistence-xml and http://stackoverflow.com/questions/3935394/how-to-externalize-properties-from-jpas-persistence-xml). How do you do it at Salesforce?
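One standard JPA approach (a sketch of what could work, not how Argus does it today) is to pass an overrides map as the second argument to Persistence.createEntityManagerFactory; the standard javax.persistence.jdbc.* keys then take precedence over the values baked into persistence.xml. The environment variable names and the unit name "argus-pu" below are hypothetical:

```java
import java.util.HashMap;
import java.util.Map;

// Builds a JPA property-override map from environment variables so the
// same built war can target different databases per deployment.
public class RuntimeJpaOverrides {
    public static Map<String, String> fromEnv(Map<String, String> env) {
        Map<String, String> overrides = new HashMap<>();
        overrides.put("javax.persistence.jdbc.user", env.get("DB_USER"));
        overrides.put("javax.persistence.jdbc.password", env.get("DB_PASSWORD"));
        return overrides;
    }
}
```

At startup you would then call something like Persistence.createEntityManagerFactory("argus-pu", RuntimeJpaOverrides.fromEnv(System.getenv())).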

Getting more done in GitHub with ZenHub

Hello! @bsura has created a ZenHub account for the SalesforceEng organization. ZenHub is the leading team collaboration and project management solution built for GitHub.


How do I use ZenHub?

To get set up with ZenHub, all you have to do is download the browser extension and log in with your GitHub account. Once you do, you'll get access to ZenHub's complete feature-set immediately.

What can ZenHub do?

ZenHub adds a series of enhancements directly inside the GitHub UI:

  • Real-time, customizable task boards for GitHub issues;
  • Burndown charts, estimates, and velocity tracking based on GitHub Milestones;
  • Personal to-do lists and task prioritization;
  • โ€œ+1โ€ button for GitHub issues and comments;
  • Drag-and-drop file sharing;
  • Time-saving shortcuts like a quick repo switcher.

Add ZenHub to GitHub

Still curious? See more ZenHub features or read user reviews. This issue was written by your friendly ZenHub bot, posted by request from @bsura.

ZenHub Board

Plans to upgrade to HBase 1.x?

Per ArgusCore/pom.xml, it looks like Argus is currently using HBase 0.98:

<groupId>org.apache.hbase</groupId>
<artifactId>hbase-client</artifactId>
<version>0.98.8-hadoop2</version>

Are there any plans for upgrading to HBase 1.x APIs?

Update default branch to `develop` via GitHub admin interface

Per issue #314, the main branch for the repository is develop, not master.

Could an admin of the repo please update the default branch via GitHub settings?

The default branch is considered the base branch in your repository, against which all pull requests and code commits are automatically made, unless you specify a different branch.

Your default branch is named master. If you have admin rights over a repository on GitHub, you can change the default branch on the repository.

Cross site scripting / CORS issue during login

I'm having trouble with CORS errors when I deploy argus locally.

I have the following setup:

  • port 8081: argus web services (the tomcat war with the rest api)
  • port 8082: argus web (the node application / frontend)

When I attempt to log in from http://localhost:8082/app/#/login, I see the preflight OPTIONS request for the POST to http://localhost:8081/argus/auth/login, and the following response headers

HTTP/1.1 200 OK
Server: Apache-Coyote/1.1
Set-Cookie: JSESSIONID=EDD932C66AF1271426DF42D846450BAC; Path=/argus; HttpOnly
Allow: POST,OPTIONS
Last-modified: Mon, 05 Dec 2016 12:09:13 UTC
Content-Type: application/vnd.sun.wadl+xml
Content-Length: 843
Date: Mon, 05 Dec 2016 12:09:13 GMT


From the javascript console, I see a CORS error

XMLHttpRequest cannot load http://localhost:8081/argus/auth/login. Response to preflight request doesn't pass access control check: No 'Access-Control-Allow-Origin' header is present on the requested resource. Origin 'http://localhost:8082' is therefore not allowed access.


From reading stack overflow, it seems that since the web services are on a different port from the node app, javascript detects it as a different origin.

So, the question is, how do you normally deploy locally? I could add a CORS filter to the war to set Access-Control-Allow-Origin: *, but that should probably not be set when deploying in production.

This setup should be easy to reproduce with the docker-compose file at https://github.com/rmelick/Argus/tree/docker-images/ArgusDocker/simple.
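For local development, the headers a CORS filter on the war would need to add look roughly like the sketch below (my names, not an Argus API; the logic would live in a javax.servlet.Filter's doFilter before delegating down the chain). Hardcoding the frontend origin is acceptable locally but, as noted above, should not ship to production:

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Computes the response headers a development-only CORS filter would set
// on every response from the web-services war.
public class DevCorsHeaders {
    public static Map<String, String> headersFor(String frontendOrigin) {
        Map<String, String> headers = new LinkedHashMap<>();
        headers.put("Access-Control-Allow-Origin", frontendOrigin);
        headers.put("Access-Control-Allow-Methods", "GET,POST,PUT,DELETE,OPTIONS");
        headers.put("Access-Control-Allow-Headers", "Content-Type,Authorization");
        // needed because the login flow relies on the JSESSIONID cookie;
        // note credentials mode forbids a wildcard "*" origin
        headers.put("Access-Control-Allow-Credentials", "true");
        return headers;
    }
}
```

Another option that avoids CORS entirely is to put both ports behind one reverse proxy so the browser sees a single origin.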

Bug: Downsample transform, the timestamp after downsample is not accurate


For example:
When the time range is smaller than the downsample resolution:

-2d:REDUCED.SLA.db:ImpactedMin:avg

DOWNSAMPLE(-2d:REDUCED.SLA.db:ImpactedMin:avg,#99d-avg#)

https://github.com/salesforce/Argus/blob/develop/ArgusCore/src/main/java/com/salesforce/dva/argus/service/metric/transform/DownsampleTransform.java

    /*
     * i.e.
     * on hour level, 01:01:30 => 01:00:00
     * on minute level, 01:01:30 => 01:01:00
     * on second level, 01:01:30 => 01:01:30
     */
    public static Long downsamplerTimestamp(Long millitimestamp, long windowSize) {
    	return millitimestamp-(millitimestamp%windowSize);
    }
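The bucketing above can be exercised in isolation to see the problem: the timestamp is floored to a multiple of the window measured from the Unix epoch, so a 99-day window can snap a datapoint to a bucket start far before a 2-day query range. A standalone copy of the method:

```java
// Standalone copy of DownsampleTransform.downsamplerTimestamp for
// demonstration: floors a millisecond timestamp to its window start,
// where windows are aligned to the Unix epoch rather than the query range.
public class DownsampleTimestampDemo {
    public static long downsamplerTimestamp(long millitimestamp, long windowSize) {
        return millitimestamp - (millitimestamp % windowSize);
    }
}
```

For any timestamp, the snapped value can be up to windowSize - 1 ms earlier, which for a 99-day window dwarfs a 2-day range.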

Using ArgusService or ArgusHttpClient in client unit tests

We are planning to use the argus-sdk inside our Java code to push metrics into Argus. I would like to write some unit tests for our code, so I would like to create fake instances of ArgusService. For example, I want to make sure that if any calls to Argus fail, the rest of our code is able to continue functioning. If ArgusService were an interface I could easily implement a version that always threw exceptions, for example. Any thoughts?
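Until that happens, one common workaround (a sketch with hypothetical names, not an Argus API) is to hide the SDK behind a narrow application-owned interface and test against a fake that always fails:

```java
// Sketch: wrap the metric-pushing calls behind an interface so production
// code can delegate to ArgusService while tests use a failing fake.
public class MetricPushSketch {
    public interface MetricSink {
        void push(String scope, String metric, double value) throws Exception;
    }

    // Fake used in unit tests to simulate Argus being unavailable.
    public static class AlwaysFailingSink implements MetricSink {
        @Override
        public void push(String scope, String metric, double value) throws Exception {
            throw new Exception("Argus unavailable (simulated)");
        }
    }

    // Example caller that must keep working when Argus is down.
    public static boolean tryPush(MetricSink sink, String scope, String metric, double value) {
        try {
            sink.push(scope, metric, value);
            return true;
        } catch (Exception e) {
            return false; // swallow and continue; real code would also log
        }
    }
}
```

A test then asserts that tryPush returns false (rather than propagating) when the sink throws, proving the rest of the pipeline survives Argus failures.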

How to control topic replica factor?

We're running a Kafka cluster with three brokers, but the Argus-related topics have no replication configured (replication factor 1).
Is it possible to control the replication factor for the individual queues (e.g. argusMetricQueue) within Argus?


Allowing openTSDB 2.3 aggregators

We currently use Argus with OpenTSDB 2.3. I've seen that Argus checks the expressions with the MetricsReader and only allows aggregators up to OpenTSDB version 2.0, so aggregators like "count", "last", "first", "none" and all the percentile ones are not available for expressions. Is there a reason for that filtering, or can we simply extend the aggregators list and use all OpenTSDB functions available in 2.3?
Best regards,
Stefan

Clean up getting started

The Getting Started page has a number of bad links (links that don't go anywhere), and a few other issues.

Update references to salesforceEng project

I noticed the travis CI button and some of the wiki links are still referring to the old project name (salesforceEng/Argus) before the migration (#108).

I couldn't figure out if the Coverity job also needed to be modified.

Build failures with maven 3.3.3 and maven-compiler-plugin 3.2

I see the following failure when I try to build on Ubuntu 15.10 with maven 3.3

[INFO] Compiling 242 source files to /home/rmelick/src/other/rmelick-Argus/ArgusCore/target/classes
An exception has occurred in the compiler (1.8.0_101). Please file a bug against the Java compiler via the Java bug reporting page (http://bugreport.java.com) after checking the Bug Database (http://bugs.java.com) for duplicates. Include your program and the following diagnostic in your report. Thank you.
java.lang.IllegalStateException: endPosTable already set
        at com.sun.tools.javac.util.DiagnosticSource.setEndPosTable(DiagnosticSource.java:136)
        at com.sun.tools.javac.util.Log.setEndPosTable(Log.java:350)
        at com.sun.tools.javac.main.JavaCompiler.parse(JavaCompiler.java:667)
        at com.sun.tools.javac.main.JavaCompiler.parseFiles(JavaCompiler.java:950)
        at com.sun.tools.javac.processing.JavacProcessingEnvironment$Round.<init>(JavacProcessingEnvironment.java:892)
        at com.sun.tools.javac.processing.JavacProcessingEnvironment$Round.next(JavacProcessingEnvironment.java:921)
        at com.sun.tools.javac.processing.JavacProcessingEnvironment.doProcessing(JavacProcessingEnvironment.java:1187)
        at com.sun.tools.javac.main.JavaCompiler.processAnnotations(JavaCompiler.java:1170)
        at com.sun.tools.javac.main.JavaCompiler.compile(JavaCompiler.java:856)
        at com.sun.tools.javac.main.Main.compile(Main.java:523)
        at com.sun.tools.javac.api.JavacTaskImpl.doCall(JavacTaskImpl.java:129)
        at com.sun.tools.javac.api.JavacTaskImpl.call(JavacTaskImpl.java:138)
        at org.codehaus.plexus.compiler.javac.JavaxToolsCompiler.compileInProcess(JavaxToolsCompiler.java:125)
        at org.codehaus.plexus.compiler.javac.JavacCompiler.performCompile(JavacCompiler.java:169)
        at org.apache.maven.plugin.compiler.AbstractCompilerMojo.execute(AbstractCompilerMojo.java:823)
        at org.apache.maven.plugin.compiler.CompilerMojo.execute(CompilerMojo.java:129)
        at org.apache.maven.plugin.DefaultBuildPluginManager.executeMojo(DefaultBuildPluginManager.java:134)
        at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:208)
        at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:153)
        at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:145)
        at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject(LifecycleModuleBuilder.java:116)
        at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject(LifecycleModuleBuilder.java:80)
        at org.apache.maven.lifecycle.internal.builder.singlethreaded.SingleThreadedBuilder.build(SingleThreadedBuilder.java:51)
        at org.apache.maven.lifecycle.internal.LifecycleStarter.execute(LifecycleStarter.java:128)
        at org.apache.maven.DefaultMaven.doExecute(DefaultMaven.java:307)
        at org.apache.maven.DefaultMaven.doExecute(DefaultMaven.java:193)
        at org.apache.maven.DefaultMaven.execute(DefaultMaven.java:106)
        at org.apache.maven.cli.MavenCli.execute(MavenCli.java:862)
        at org.apache.maven.cli.MavenCli.doMain(MavenCli.java:286)
        at org.apache.maven.cli.MavenCli.main(MavenCli.java:197)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at org.codehaus.plexus.classworlds.launcher.Launcher.launchEnhanced(Launcher.java:289)
        at org.codehaus.plexus.classworlds.launcher.Launcher.launch(Launcher.java:229)
        at org.codehaus.plexus.classworlds.launcher.Launcher.mainWithExitCode(Launcher.java:415)
        at org.codehaus.plexus.classworlds.launcher.Launcher.main(Launcher.java:356)
[INFO] -------------------------------------------------------------
[ERROR] COMPILATION ERROR : 
[INFO] -------------------------------------------------------------
[ERROR] An unknown compilation problem occurred
[INFO] 1 error
[INFO] -------------------------------------------------------------
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Argus .............................................. SUCCESS [  1.107 s]
[INFO] ArgusCore .......................................... FAILURE [  2.117 s]
[INFO] ArgusWebServices ................................... SKIPPED
[INFO] ArgusClient ........................................ SKIPPED
[INFO] ArgusSDK ........................................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 3.685 s
[INFO] Finished at: 2016-12-05T10:56:50+01:00
[INFO] Final Memory: 45M/502M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-compiler-plugin:3.2:compile (default-compile) on project argus-core: Compilation failure
[ERROR] An unknown compilation problem occurred
[ERROR] -> [Help 1]

rmelick@rmelick-ld:~/src/other/rmelick-Argus$ mvn -v
Apache Maven 3.3.3
Maven home: /usr/share/maven
Java version: 1.8.0_101, vendor: Oracle Corporation
Java home: /usr/lib/jvm/java-8-oracle/jre
Default locale: en_US, platform encoding: UTF-8
OS name: "linux", version: "4.2.0-42-generic", arch: "amd64", family: "unix"

PSQLException when using postgres for persistence

The wiki mentioned that Argus had been tested with Postgres as a persistent database. I attempted to configure that, but ran into an exception while trying to create a JPAEntity:

[ ARGUS | *NULLSESSION* | *NULLUSER* | *NULLTXID* | 2016-12-08 09:00:28.611 | ost-startStop-1 | DEBUG ] Retrieving the administrative user.
[ ARGUS | *NULLSESSION* | *NULLUSER* | *NULLTXID* | 2016-12-08 09:00:28.620 | ost-startStop-1 | DEBUG ] Query for user having id 1 resulted in : null
[ ARGUS | *NULLSESSION* | *NULLUSER* | *NULLTXID* | 2016-12-08 09:00:28.629 | ost-startStop-1 | DEBUG ] Updated user to : PrincipalUser{userName=admin, [email protected], preferences={{}}, privileged=true}
[ ARGUS | *NULLSESSION* | *NULLUSER* | *NULLTXID* | 2016-12-08 09:00:28.664 | ost-startStop-1 | DEBUG ] Created audit object Audit{id=1, createdDate=Thu Dec 08 09:00:28 GMT 2016, message=Updated user : PrincipalUser{userName=admin, [email protected], preferences={{}}, privileged=true}, hostName=argus-web-services, object=1}
[ ARGUS | *NULLSESSION* | *NULLUSER* | *NULLTXID* | 2016-12-08 09:00:28.679 | ost-startStop-1 | ERROR ] SystemMain startup aborted.
com.salesforce.dva.argus.system.SystemException: javax.persistence.PersistenceException: Exception [EclipseLink-4002] (Eclipse Persistence Services - 2.6.2.v20151217-774c696): org.eclipse.persistence.exceptions.DatabaseException
Internal Exception: org.postgresql.util.PSQLException: ERROR: column "deleted" is of type smallint but expression is of type boolean
  Hint: You will need to rewrite or cast the expression.
  Position: 116
Error Code: 0
Call: INSERT INTO JPAENTITY (ID, CREATEDDATE, DELETED, MODIFIEDDATE, CREATEDBY_ID, MODIFIEDBY_ID, DTYPE) VALUES (?, ?, ?, ?, ?, ?, ?)
	bind => [1, 2016-12-08 09:00:28.627, false, 2016-12-08 09:00:28.627, null, null, PrincipalUser]
Query: InsertObjectQuery(PrincipalUser{userName=admin, [email protected], preferences={{}}, privileged=true})
	at com.salesforce.dva.argus.service.jpa.DefaultUserService.findAdminUser(DefaultUserService.java:148) ~[argus-core-2.2.1-SNAPSHOT.jar:na]
	at com.google.inject.persist.jpa.JpaLocalTxnInterceptor.invoke(JpaLocalTxnInterceptor.java:70) ~[guice-persist-4.0.jar:na]
	at com.salesforce.dva.argus.system.SystemMain.doStart(SystemMain.java:114) ~[argus-core-2.2.1-SNAPSHOT.jar:na]
	at com.salesforce.dva.argus.system.SystemService.start(SystemService.java:104) [argus-core-2.2.1-SNAPSHOT.jar:na]
	at com.salesforce.dva.argus.ws.listeners.ArgusWebServletListener.contextInitialized(ArgusWebServletListener.java:92) [classes/:na]
	at org.apache.catalina.core.StandardContext.listenerStart(StandardContext.java:5118) [catalina.jar:7.0.73]
	at org.apache.catalina.core.StandardContext.startInternal(StandardContext.java:5634) [catalina.jar:7.0.73]
	at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:145) [catalina.jar:7.0.73]
	at org.apache.catalina.core.ContainerBase.addChildInternal(ContainerBase.java:899) [catalina.jar:7.0.73]
	at org.apache.catalina.core.ContainerBase.addChild(ContainerBase.java:875) [catalina.jar:7.0.73]
	at org.apache.catalina.core.StandardHost.addChild(StandardHost.java:652) [catalina.jar:7.0.73]
	at org.apache.catalina.startup.HostConfig.deployWAR(HostConfig.java:1092) [catalina.jar:7.0.73]
	at org.apache.catalina.startup.HostConfig$DeployWar.run(HostConfig.java:1984) [catalina.jar:7.0.73]
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) [na:1.8.0_111-internal]
	at java.util.concurrent.FutureTask.run(FutureTask.java:266) [na:1.8.0_111-internal]
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) [na:1.8.0_111-internal]
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) [na:1.8.0_111-internal]
	at java.lang.Thread.run(Thread.java:745) [na:1.8.0_111-internal]
Caused by: javax.persistence.PersistenceException: Exception [EclipseLink-4002] (Eclipse Persistence Services - 2.6.2.v20151217-774c696): org.eclipse.persistence.exceptions.DatabaseException
Internal Exception: org.postgresql.util.PSQLException: ERROR: column "deleted" is of type smallint but expression is of type boolean
  Hint: You will need to rewrite or cast the expression.
  Position: 116
Error Code: 0
Call: INSERT INTO JPAENTITY (ID, CREATEDDATE, DELETED, MODIFIEDDATE, CREATEDBY_ID, MODIFIEDBY_ID, DTYPE) VALUES (?, ?, ?, ?, ?, ?, ?)
	bind => [1, 2016-12-08 09:00:28.627, false, 2016-12-08 09:00:28.627, null, null, PrincipalUser]
Query: InsertObjectQuery(PrincipalUser{userName=admin, [email protected], preferences={{}}, privileged=true})
	at org.eclipse.persistence.internal.jpa.EntityManagerImpl.flush(EntityManagerImpl.java:879) ~[eclipselink-2.6.2.jar:2.6.2.v20151217-774c696]
	at com.salesforce.dva.argus.service.jpa.DefaultUserService.updateUser(DefaultUserService.java:128) ~[argus-core-2.2.1-SNAPSHOT.jar:na]
	at com.google.inject.persist.jpa.JpaLocalTxnInterceptor.invoke(JpaLocalTxnInterceptor.java:62) ~[guice-persist-4.0.jar:na]
	at com.salesforce.dva.argus.service.jpa.DefaultUserService.findAdminUser(DefaultUserService.java:145) ~[argus-core-2.2.1-SNAPSHOT.jar:na]
	... 17 common frames omitted
Caused by: org.eclipse.persistence.exceptions.DatabaseException: 
Internal Exception: org.postgresql.util.PSQLException: ERROR: column "deleted" is of type smallint but expression is of type boolean
  Hint: You will need to rewrite or cast the expression.
  Position: 116
Error Code: 0
Call: INSERT INTO JPAENTITY (ID, CREATEDDATE, DELETED, MODIFIEDDATE, CREATEDBY_ID, MODIFIEDBY_ID, DTYPE) VALUES (?, ?, ?, ?, ?, ?, ?)
	bind => [1, 2016-12-08 09:00:28.627, false, 2016-12-08 09:00:28.627, null, null, PrincipalUser]
Query: InsertObjectQuery(PrincipalUser{userName=admin, [email protected], preferences={{}}, privileged=true})
	at org.eclipse.persistence.exceptions.DatabaseException.sqlException(DatabaseException.java:340) ~[eclipselink-2.6.2.jar:2.6.2.v20151217-774c696]
	at org.eclipse.persistence.internal.databaseaccess.DatabaseAccessor.basicExecuteCall(DatabaseAccessor.java:684) ~[eclipselink-2.6.2.jar:2.6.2.v20151217-774c696]
	at org.eclipse.persistence.internal.databaseaccess.DatabaseAccessor.executeCall(DatabaseAccessor.java:560) ~[eclipselink-2.6.2.jar:2.6.2.v20151217-774c696]
	at org.eclipse.persistence.internal.sessions.AbstractSession.basicExecuteCall(AbstractSession.java:2055) ~[eclipselink-2.6.2.jar:2.6.2.v20151217-774c696]
	at org.eclipse.persistence.sessions.server.ClientSession.executeCall(ClientSession.java:306) ~[eclipselink-2.6.2.jar:2.6.2.v20151217-774c696]
	at org.eclipse.persistence.internal.queries.DatasourceCallQueryMechanism.executeCall(DatasourceCallQueryMechanism.java:242) ~[eclipselink-2.6.2.jar:2.6.2.v20151217-774c696]
	at org.eclipse.persistence.internal.queries.DatasourceCallQueryMechanism.insertObject(DatasourceCallQueryMechanism.java:363) ~[eclipselink-2.6.2.jar:2.6.2.v20151217-774c696]
	at org.eclipse.persistence.internal.queries.StatementQueryMechanism.insertObject(StatementQueryMechanism.java:165) ~[eclipselink-2.6.2.jar:2.6.2.v20151217-774c696]
	at org.eclipse.persistence.internal.queries.StatementQueryMechanism.insertObject(StatementQueryMechanism.java:180) ~[eclipselink-2.6.2.jar:2.6.2.v20151217-774c696]
	at org.eclipse.persistence.internal.queries.DatabaseQueryMechanism.insertObjectForWrite(DatabaseQueryMechanism.java:489) ~[eclipselink-2.6.2.jar:2.6.2.v20151217-774c696]
	at org.eclipse.persistence.queries.InsertObjectQuery.executeCommit(InsertObjectQuery.java:80) ~[eclipselink-2.6.2.jar:2.6.2.v20151217-774c696]
	at org.eclipse.persistence.queries.InsertObjectQuery.executeCommitWithChangeSet(InsertObjectQuery.java:90) ~[eclipselink-2.6.2.jar:2.6.2.v20151217-774c696]
	at org.eclipse.persistence.internal.queries.DatabaseQueryMechanism.executeWriteWithChangeSet(DatabaseQueryMechanism.java:301) ~[eclipselink-2.6.2.jar:2.6.2.v20151217-774c696]
	at org.eclipse.persistence.queries.WriteObjectQuery.executeDatabaseQuery(WriteObjectQuery.java:58) ~[eclipselink-2.6.2.jar:2.6.2.v20151217-774c696]
	at org.eclipse.persistence.queries.DatabaseQuery.execute(DatabaseQuery.java:904) ~[eclipselink-2.6.2.jar:2.6.2.v20151217-774c696]
	at org.eclipse.persistence.queries.DatabaseQuery.executeInUnitOfWork(DatabaseQuery.java:803) ~[eclipselink-2.6.2.jar:2.6.2.v20151217-774c696]
	at org.eclipse.persistence.queries.ObjectLevelModifyQuery.executeInUnitOfWorkObjectLevelModifyQuery(ObjectLevelModifyQuery.java:108) ~[eclipselink-2.6.2.jar:2.6.2.v20151217-774c696]
	at org.eclipse.persistence.queries.ObjectLevelModifyQuery.executeInUnitOfWork(ObjectLevelModifyQuery.java:85) ~[eclipselink-2.6.2.jar:2.6.2.v20151217-774c696]
	at org.eclipse.persistence.internal.sessions.UnitOfWorkImpl.internalExecuteQuery(UnitOfWorkImpl.java:2896) ~[eclipselink-2.6.2.jar:2.6.2.v20151217-774c696]
	at org.eclipse.persistence.internal.sessions.AbstractSession.executeQuery(AbstractSession.java:1857) ~[eclipselink-2.6.2.jar:2.6.2.v20151217-774c696]
	at org.eclipse.persistence.internal.sessions.AbstractSession.executeQuery(AbstractSession.java:1839) ~[eclipselink-2.6.2.jar:2.6.2.v20151217-774c696]
	at org.eclipse.persistence.internal.sessions.AbstractSession.executeQuery(AbstractSession.java:1790) ~[eclipselink-2.6.2.jar:2.6.2.v20151217-774c696]
	at org.eclipse.persistence.internal.sessions.CommitManager.commitNewObjectsForClassWithChangeSet(CommitManager.java:227) ~[eclipselink-2.6.2.jar:2.6.2.v20151217-774c696]
	at org.eclipse.persistence.internal.sessions.CommitManager.commitAllObjectsForClassWithChangeSet(CommitManager.java:194) ~[eclipselink-2.6.2.jar:2.6.2.v20151217-774c696]
	at org.eclipse.persistence.internal.sessions.CommitManager.commitAllObjectsWithChangeSet(CommitManager.java:139) ~[eclipselink-2.6.2.jar:2.6.2.v20151217-774c696]
	at org.eclipse.persistence.internal.sessions.AbstractSession.writeAllObjectsWithChangeSet(AbstractSession.java:4263) ~[eclipselink-2.6.2.jar:2.6.2.v20151217-774c696]
	at org.eclipse.persistence.internal.sessions.UnitOfWorkImpl.commitToDatabase(UnitOfWorkImpl.java:1441) ~[eclipselink-2.6.2.jar:2.6.2.v20151217-774c696]
	at org.eclipse.persistence.internal.sessions.UnitOfWorkImpl.commitToDatabaseWithPreBuiltChangeSet(UnitOfWorkImpl.java:1587) ~[eclipselink-2.6.2.jar:2.6.2.v20151217-774c696]
	at org.eclipse.persistence.internal.sessions.RepeatableWriteUnitOfWork.writeChanges(RepeatableWriteUnitOfWork.java:455) ~[eclipselink-2.6.2.jar:2.6.2.v20151217-774c696]
	at org.eclipse.persistence.internal.jpa.EntityManagerImpl.flush(EntityManagerImpl.java:874) ~[eclipselink-2.6.2.jar:2.6.2.v20151217-774c696]
	... 20 common frames omitted
Caused by: org.postgresql.util.PSQLException: ERROR: column "deleted" is of type smallint but expression is of type boolean
  Hint: You will need to rewrite or cast the expression.
  Position: 116
	at org.postgresql.core.v3.QueryExecutorImpl.receiveErrorResponse(QueryExecutorImpl.java:2103) ~[postgresql-9.1-901-1.jdbc4.jar:na]
	at org.postgresql.core.v3.QueryExecutorImpl.processResults(QueryExecutorImpl.java:1836) ~[postgresql-9.1-901-1.jdbc4.jar:na]
	at org.postgresql.core.v3.QueryExecutorImpl.execute(QueryExecutorImpl.java:257) ~[postgresql-9.1-901-1.jdbc4.jar:na]
	at org.postgresql.jdbc3.AbstractJdbc3Statement.getParameterMetaData(AbstractJdbc3Statement.java:414) ~[postgresql-9.1-901-1.jdbc4.jar:na]
	at org.eclipse.persistence.platform.database.DerbyPlatform.setNullFromDatabaseField(DerbyPlatform.java:302) ~[eclipselink-2.6.2.jar:2.6.2.v20151217-774c696]
	at org.eclipse.persistence.internal.databaseaccess.DatabasePlatform.setParameterValueInDatabaseCall(DatabasePlatform.java:2477) ~[eclipselink-2.6.2.jar:2.6.2.v20151217-774c696]
	at org.eclipse.persistence.internal.databaseaccess.DatabaseCall.prepareStatement(DatabaseCall.java:797) ~[eclipselink-2.6.2.jar:2.6.2.v20151217-774c696]
	at org.eclipse.persistence.internal.databaseaccess.DatabaseAccessor.basicExecuteCall(DatabaseAccessor.java:621) ~[eclipselink-2.6.2.jar:2.6.2.v20151217-774c696]
	... 48 common frames omitted

My argus-build.properties looks like the following during the build process:

# Default settings for unit and integration tests.
build.property.persistence.unit=<provider>org.eclipse.persistence.jpa.PersistenceProvider</provider>\n\
<exclude-unlisted-classes>false</exclude-unlisted-classes>\n\
<properties>\n\
	<property name="javax.persistence.schema-generation.database.action" value="drop-and-create-tables"/>\n\
	<property name="javax.persistence.jdbc.driver" value="org.postgresql.Driver"/>\n\
	<property name="javax.persistence.jdbc.url" value="jdbc:postgresql://postgres:5432/argus_user"/>\n\
	<property name="javax.persistence.jdbc.user" value="argus_user"/>\n\
	<property name="javax.persistence.jdbc.password" value="password"/>\n\
	<property name="eclipselink.ddl-generation" value="drop-and-create-tables"/>\n\
	<property name="eclipselink.logging.level" value="SEVERE"/>\n\
	<property name="eclipselink.logging.parameters" value="true"/>\n\
	<property name="eclipselink.target-database" value="Auto"/>\n\
	<property name="eclipselink.canonicalmodel.subpackage" value="unit"/>\n\
</properties>
build.property.secure.cookies=false
[email protected]
system.property.log.level=ERROR
system.property.mail.enabled=false
service.property.mq.connection.count=2
service.property.mq.endpoint=vm\://localhost?broker.persistent\=false
service.property.auth.ldap.authtype=simple
service.property.auth.ldap.endpoint=ldaps://ldaps.mycompany.com:636
service.property.auth.ldap.searchbase=OU=active,OU=users,DC=mycompany,DC=com:OU\=active,OU\=robot,DC\=mycompany,DC\=com
service.property.auth.ldap.searchdn=CN=argus_service,OU=active,OU=users,DC=mycompany,DC=com
service.property.auth.ldap.searchpwd=argus_service_password
service.property.auth.ldap.usernamefield=sAMAccountName
service.property.mail.alerturl.template=https\://localhost\:8443/argus/\#/alerts/$alertid$
service.property.mail.metricurl.template=https\://localhost\:8443/argus/\#/viewmetrics?expression\=$expression$
service.property.mail.smtp.auth=false
service.property.mail.smtp.host=smtprelay.mycompany.com
service.property.mail.smtp.starttls.enable=false
service.property.tsdb.connection.count=2
service.property.tsdb.endpoint.read=http://tsdbread.mycompany.com:4466
service.property.tsdb.endpoint.timeout=10000
service.property.tsdb.endpoint.write=http://tsdbwrite.mycompany.com:4477
service.property.cache.redis.cluster=redis0.mycompany.com:6379,redis1.mycompany.com:6389
asynchbase.property.hbase.zookeeper.connect=host1,host2:2181

and the following during runtime:

# Default settings for unit and integration tests.
build.property.persistence.unit=<provider>org.eclipse.persistence.jpa.PersistenceProvider</provider>\n\
<exclude-unlisted-classes>false</exclude-unlisted-classes>\n\
<properties>\n\
	<property name="javax.persistence.schema-generation.database.action" value="drop-and-create-tables"/>\n\
	<property name="javax.persistence.jdbc.driver" value="org.postgresql.Driver"/>\n\
	<property name="javax.persistence.jdbc.url" value="jdbc:postgresql://postgres:5432/argus_user"/>\n\
	<property name="javax.persistence.jdbc.user" value="argus_user"/>\n\
	<property name="javax.persistence.jdbc.password" value="password"/>\n\
	<property name="eclipselink.ddl-generation" value="drop-and-create-tables"/>\n\
	<property name="eclipselink.logging.level" value="SEVERE"/>\n\
	<property name="eclipselink.logging.parameters" value="true"/>\n\
	<property name="eclipselink.target-database" value="DERBY"/>\n\
	<property name="eclipselink.canonicalmodel.subpackage" value="unit"/>\n\
</properties>
build.property.secure.cookies=false
[email protected]
system.property.log.level=DEBUG
system.property.mail.enabled=false

# skip ldap (any user can log in with any password)
service.binding.auth=com.salesforce.dva.argus.service.auth.NoAuthService

service.property.mail.alerturl.template=https\://localhost\:8443/argus/\#/alerts/$alertid$
service.property.mail.metricurl.template=https\://localhost\:8443/argus/\#/viewmetrics?expression\=$expression$
service.property.mail.smtp.auth=false
service.property.mail.smtp.host=smtprelay.mycompany.com
service.property.mail.smtp.starttls.enable=false
service.property.tsdb.connection.count=2
service.property.tsdb.endpoint.read=http://opentsdb:4242
service.property.tsdb.endpoint.timeout=10000
service.property.tsdb.endpoint.write=http://opentsdb:4242
service.property.cache.redis.cluster=redis:6379

# kafka
service.property.mq.kafka.brokers=kafka:9092
service.property.mq.zookeeper.connect=kafka:2181
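One detail worth noting: the stack trace above ends in DerbyPlatform.setNullFromDatabaseField, and this runtime configuration pins eclipselink.target-database to DERBY even though the JDBC URL points at Postgres, which would explain the smallint-vs-boolean mismatch on the DELETED column. A likely fix (a sketch, not verified against Argus) is to target PostgreSQL explicitly so EclipseLink generates PostgreSQL-compatible DDL and bindings:

```properties
# In the persistence-unit fragment of argus-build.properties,
# replace the DERBY target with PostgreSQL:
<property name="eclipselink.target-database" value="PostgreSQL"/>
```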

Getting Argus running does not work

Hi,
I tried to get Argus working in a Docker environment, but the documentation seems to be incomplete.
Are there other documentation resources with more detail? In particular, documentation on how to get the ArgusWeb app working appears to be missing, as far as I can tell. I read the wiki and looked through the README files. Thank you!

Update dependencies to address CVE/CWEs

Looking for clear Postgres integration instructions

Hi, I'm using neither Docker nor Eclipse; I'm trying to install Argus on an AWS EC2 instance. When I run mvn test, I get "Class [org.postgressql.Driver] not found."

I've tried adding the jar to my CLASSPATH, adding it as a Maven dependency, installing the postgressql-42.1.0.jar into my local Maven repository (it reported success), etc. No dice.

What are the specific steps necessary to install and use Postgres with Argus? I wish the documentation were much clearer about prerequisites and getting started for non-Docker, non-Eclipse shops.
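For what it's worth, the quoted class name contains an extra "s": the PostgreSQL JDBC driver class is org.postgresql.Driver (not org.postgressql.Driver), which alone would produce a class-not-found error. The driver's published Maven coordinates, shown here as a suggestion rather than an Argus-specific instruction, are:

```xml
<!-- PostgreSQL JDBC driver; declare it in the pom.xml of the module
     that opens the database connection -->
<dependency>
  <groupId>org.postgresql</groupId>
  <artifactId>postgresql</artifactId>
  <version>42.1.0</version>
</dependency>
```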

ArgusWeb can't login

Hi,
I want to launch the Argus platform without LDAP support. I did this by adding

service.binding.auth=com.salesforce.dva.argus.service.auth.NoAuthService

to the argus.properties file. For argus.properties I used the blueprint from the Getting Started section for the integration tests; I hope that's OK. When I open the website, a login prompt shows up.
Is this behavior correct? Which credentials do I have to enter?
If I enter some credentials and hit the submit button, a toast message 'Login failed' shows up.
What confuses me is that the only request sent after hitting the submit button is an OPTIONS request, which gets a 200 response and advertises the verbs POST and OPTIONS.
I would expect a POST request, which should be what triggers a message like 'Login failed'.

Basically, I have two questions:
1. How do I enable the NoAuthService, and how can I check that it is working?
2. Is the login behavior described above expected?

Thanks for your help!

Failed to execute goal org.apache.maven.plugins:maven-dependency-plugin:2.10:unpack-dependencies

Hey guys, I am trying to install Argus on my machine. I've installed all the dependencies (I hope I did not miss any) mentioned in https://github.com/SalesforceEng/Argus/wiki/Getting%20Started, but when I run mvn test, I get this error:
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO]
[INFO] Argus ............................................. SUCCESS [0.555s]
[INFO] ArgusCore ......................................... SUCCESS [4:03.576s]
[INFO] ArgusWebServices .................................. FAILURE [0.062s]
[INFO] ArgusClient ....................................... SKIPPED
[INFO] ArgusSDK .......................................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 4:04.305s
[INFO] Finished at: Wed Sep 28 18:06:46 WIB 2016
[INFO] Final Memory: 64M/929M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-dependency-plugin:2.10:unpack-dependencies (unpack-shared-resources) on project argus-webservices: Artifact has not been packaged yet. When used on reactor artifact, unpack should be executed after packaging: see MDEP-98. -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
[ERROR]
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR] mvn -rf :argus-webservices

Any advice? Thanks
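A common cause of this failure (MDEP-98, cited in the error) is that unpack-dependencies in argus-webservices runs before the argus-core reactor artifact has been packaged. One workaround that often helps, in line with the README's resource-filter step, is to install the parent and the module artifacts into the local repository once before running tests (a suggested sequence, not an official Argus procedure):

```shell
# One-time: install the parent POM so the resource filters exist
mvn -DskipTests=true --non-recursive install

# Install all module artifacts so unpack-dependencies can
# resolve argus-core from the local repository
mvn -DskipTests=true install

# Subsequent test runs should then get past unpack-dependencies
mvn test
```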
