salesforce / argus
Time series monitoring and alerting platform.
License: BSD 3-Clause "New" or "Revised" License
Hi,
I recently installed and compiled Argus successfully. However, the tests are failing in the "ArgusCore" subsystem for the Warden service. Has anyone seen this?
My config file and error messages are attached.
Hi @saaja-sfdc @rajsarkapally-sfdc,
I recently found that I need this transform. Please let me know if it is already implemented; otherwise I will write one and open a PR.
DEDUCT(M1, M2) returns M1 - M2 along the time dimension: it removes all datapoints from M1 whose timestamps are found in M2. This is the opposite of UNION.
Note: it is not MIN. MIN only aligns both metrics by key and subtracts the values.
Usage: DEDUCT(List).
Example:
input:
M1 is 1L:10, 2L:10, 3L:20
M2 is 1L:10
output of DEDUCT(M1, M2):
M is 2L:10, 3L:20
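A minimal sketch of the proposed semantics (illustrative only: the real transform would operate on ArgusCore's Metric type, not on plain maps of timestamp to value):

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class DeductSketch {
    // Removes from m1 every timestamp that also appears in m2 (the proposed
    // DEDUCT semantics). Values in m2 are ignored; only its keys matter.
    static Map<Long, Double> deduct(Map<Long, Double> m1, Map<Long, Double> m2) {
        Map<Long, Double> result = new LinkedHashMap<>(m1);
        result.keySet().removeAll(m2.keySet());
        return result;
    }

    public static void main(String[] args) {
        Map<Long, Double> m1 = new LinkedHashMap<>();
        m1.put(1L, 10.0); m1.put(2L, 10.0); m1.put(3L, 20.0);
        Map<Long, Double> m2 = new LinkedHashMap<>();
        m2.put(1L, 10.0);
        System.out.println(deduct(m1, m2)); // {2=10.0, 3=20.0}
    }
}
```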
I think it is very hard to install; the wiki page is not detailed.
Once the docker images are available (#302), I'll need to update the wiki to describe how to properly use them. I was also thinking of enhancing a few other parts that confused me when I tried to get started deploying Argus. Before I start, I have the following questions:
The includeArtifactIds tag is misspelled in several pom files:
<includeArtifacIds>argus</includeArtifacIds>
Bug: in the downsample transform, the timestamp after downsampling is not accurate.
For example, when the time range is smaller than the downsample resolution:
-2d:REDUCED.SLA.db:ImpactedMin:avg
DOWNSAMPLE(-2d:REDUCED.SLA.db:ImpactedMin:avg,#99d-avg#)
/*
* i.e.
* on hour level, 01:01:30 => 01:00:00
* on minute level, 01:01:30 => 01:01:00
* on second level, 01:01:30 => 01:01:30
*/
public static Long downsamplerTimestamp(Long millitimestamp, long windowSize) {
    return millitimestamp - (millitimestamp % windowSize);
}
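The snap-to-window behavior can be seen directly. Below is a self-contained copy of the method above with illustrative epoch-offset timestamps:

```java
public class DownsampleDemo {
    // The method from the issue: timestamps snap to the start of their window.
    public static Long downsamplerTimestamp(Long millitimestamp, long windowSize) {
        return millitimestamp - (millitimestamp % windowSize);
    }

    public static void main(String[] args) {
        long t = 3_690_000L;                                     // 01:01:30 after the epoch
        System.out.println(downsamplerTimestamp(t, 3_600_000L)); // 3600000 = 01:00:00
        System.out.println(downsamplerTimestamp(t, 60_000L));    // 3660000 = 01:01:00
        // With a very large window (e.g. #99d-avg# over a 2d range), every point
        // collapses into one window anchored at a multiple of windowSize from the
        // epoch, which need not fall inside the queried time range -- which looks
        // like the inaccuracy this issue reports.
    }
}
```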
How do I compile and set up the web module?
The argus parent pom is not published to maven central (http://search.maven.org/#search%7Cga%7C1%7Cargus). This makes it impossible for maven to resolve dependencies on the subprojects correctly.
I created a test project (https://github.com/rmelick/argusclient) that shows the following failure if you build with mvn package -s settings.xml -U
[ERROR] Failed to execute goal on project argustest: Could not resolve dependencies for project net.rmelick.maven:argustest:jar:1.0-SNAPSHOT: Failed to collect dependencies at com.salesforce.argus:argus-client:jar:2.1.1: Failed to read artifact descriptor for com.salesforce.argus:argus-client:jar:2.1.1: Could not find artifact com.salesforce.argus:argus:pom:2.1.1 in central (https://repo.maven.apache.org/maven2) -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/DependencyResolutionException
Per issue #314, the main branch for the repository is develop, not master.
Could an admin of the repo please update the default branch via the GitHub settings?
"The default branch is considered the base branch in your repository, against which all pull requests and code commits are automatically made, unless you specify a different branch. Your default branch is named master. If you have admin rights over a repository on GitHub, you can change the default branch on the repository."
https://github.com/aertoria/Argus.wiki.git
Thanks.
To whoever is an admin: follow the "Merge changes" section at https://gist.github.com/larrybotha/10650410.
Hi,
I want to launch the Argus platform without ldap support. I did this by adding
service.binding.auth=com.salesforce.dva.argus.service.auth.NoAuthService
to the argus.properties file. For the argus.properties I used the blueprint in the Getting Started Section for the integration tests. I hope that's ok. When I open the website a login prompt shows up.
Is this behavior correct? Which credentials do I have to enter?
If I enter some credentials and hit the submit button a toast message 'Login failed' shows up.
What confuses me is that the only request sent after hitting the submit button is an OPTIONS request, which gets a response code of 200 and returns the verbs POST and OPTIONS.
I would expect a POST request, which should trigger a message like 'Login failed'.
Basically I have two questions.
How do I enable the NoAuthService and how can I check that it is working?
Is the login behavior described above as expected?
Thanks for your help!
testHoltWintersDeviation(com.salesforce.dva.argus.service.metric.transform.HoltWintersTransformTest) Time elapsed: 0.031 sec <<< FAILURE!
java.lang.AssertionError
at com.salesforce.dva.argus.service.metric.transform.HoltWintersTransformTest.testHoltWintersDeviation(HoltWintersTransformTest.java:115)
testHoltWintersForecast(com.salesforce.dva.argus.service.metric.transform.HoltWintersTransformTest) Time elapsed: 0 sec <<< FAILURE!
java.lang.AssertionError
at com.salesforce.dva.argus.service.metric.transform.HoltWintersTransformTest.testHoltWintersForecast(HoltWintersTransformTest.java:80)
After improving the assert statement, the error becomes more useful:
java.lang.AssertionError:
Expected :{2=0, 3=0.1, 4=0.27997, 5=0.72374, 6=1.57824, 7=1.56059, 8=1.46463, 9=1.45847, 10=1.47683}
Actual :{2=0, 3=0,1, 4=0,27997, 5=0,72374, 6=1,57824, 7=1,56059, 8=1,46463, 9=1,45847, 10=1,47683}
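The comma-formatted "Actual" values suggest a locale-dependent number format somewhere in the test or transform. This is a guess at the root cause, not a confirmed diagnosis; a minimal illustration of how the default locale changes the decimal separator:

```java
import java.util.Locale;

public class LocaleDemo {
    public static void main(String[] args) {
        // On a machine whose default locale uses a comma as the decimal
        // separator (e.g. German), String.format without an explicit Locale
        // produces "0,27997" instead of "0.27997" -- matching the failure.
        System.out.println(String.format(Locale.GERMANY, "%.5f", 0.27997)); // 0,27997
        System.out.println(String.format(Locale.US, "%.5f", 0.27997));      // 0.27997
    }
}
```

If this is the cause, pinning an explicit Locale (e.g. Locale.US) wherever the test or transform formats numbers should make the build locale-independent.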
Hi,
I tried to get Argus working in a docker environment. The documentation seems to be incomplete.
Are there other documentation resources with detailed information? Especially the documentation about how to get the ArgusWeb App working is missing as far as I know. I read the wiki and had a look into the readme files. Thank you!
We are planning to use the argus-sdk inside our java code to push metrics into Argus. I would like to write some unit tests for our code, so I would to create fake instances of ArgusService. For example, I want to make sure that if any calls to Argus fail, the rest of our code is able to continue functioning. If ArgusService was an interface I could easily implement a version that always threw exceptions, for example. Any thoughts?
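Until ArgusService extracts an interface, one workaround is to put your own thin interface between your code and the SDK, so tests can inject a deliberately failing implementation. This is a sketch: MetricSink, AlwaysFailingSink, and Reporter are hypothetical names, not part of argus-sdk.

```java
import java.util.List;

interface MetricSink {
    void push(List<String> metrics) throws Exception;
}

// Test double that simulates an Argus outage on every call.
class AlwaysFailingSink implements MetricSink {
    @Override
    public void push(List<String> metrics) throws Exception {
        throw new Exception("simulated Argus outage");
    }
}

class Reporter {
    private final MetricSink sink;
    Reporter(MetricSink sink) { this.sink = sink; }

    // Returns true if the push succeeded; swallows failures so the rest of
    // the application keeps functioning when Argus is down.
    boolean report(List<String> metrics) {
        try {
            sink.push(metrics);
            return true;
        } catch (Exception e) {
            return false;
        }
    }
}
```

In production the MetricSink implementation would delegate to the concrete ArgusService; in tests you pass AlwaysFailingSink and assert your code survives.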
I have a dashboard with an <ag-text> parameter for datacenter.
<ag-text type="text" name="datacenter" label="Datacenter" default="STAGE"></ag-text>
Some of my metrics have the datacenter variable in UPPERCASE, and some in lowercase. Argus needs a way to adjust the case of text parameters used in binding expressions.
My hacky fix is to have a second <ag-text> parameter for datacenter-lower-case, but this is very suboptimal.
The discovery service returns duplicate results.
For example, there are two metrics:
core.WAS.SP1.a
core.WAS.SP1.b
when a user types core.was.*
the result shows two records, as below:
core.WAS.SP1
core.WAS.SP1
rather than just one.
As a result, the discovery window fills up quickly and becomes less useful.
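A sketch of the expected behavior, collapsing duplicate records while preserving order (illustrative only; not the actual discovery service code):

```java
import java.util.ArrayList;
import java.util.LinkedHashSet;
import java.util.List;

public class DedupDemo {
    // LinkedHashSet drops duplicates while keeping first-seen order,
    // so core.WAS.SP1 appears once no matter how many metrics share it.
    static List<String> dedup(List<String> records) {
        return new ArrayList<>(new LinkedHashSet<>(records));
    }
}
```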
Scenario: trying to put ~20k metrics in a single HTTP request to /argus/collection/metrics fails with a misleading error message.
When pushing 20k metrics via /argus/collection/metrics, the request fails with a 500. The following insufficient log message occurs in the metric client's log:
[ ARGUS | *NULLSESSION* | *NULLUSER* | *NULLTXID* | 2017-05-05 14:09:23.658 | ccommitclient-1 | INFO ] Error occured while committing metrics. Reason com.salesforce.dva.argus.system.SystemException: com.fasterxml.jackson.core.JsonParseException: Unexpected character ('<' (code 60)): expected a valid value (number, String, array, object, 'true', 'false' or 'null')
at [Source: java.io.StringReader@385564dd; line: 1, column: 2]
The root cause of this error is that chunked requests to OpenTSDB are not allowed by default, so the REST call to OpenTSDB's /api/put returns a 400 and some HTML content (which contains the actual error message, 'Chunked request not supported.').
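If you operate the OpenTSDB instance yourself, the underlying 400 can be avoided by enabling chunked requests in opentsdb.conf. These are standard OpenTSDB options; the chunk size below is only an example and should be tuned to your payload:

```properties
# Allow chunked HTTP requests (disabled by default in OpenTSDB)
tsd.http.request.enable_chunked = true
# Maximum request body size to buffer, in bytes (example value)
tsd.http.request.max_chunk = 16777216
```

Separately, Argus could surface the HTML error body in its log instead of feeding it to the JSON parser, which would fix the misleading message itself.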
Per ArgusCore/pom.xml, it looks like Argus is currently using HBase 0.98:
<groupId>org.apache.hbase</groupId> <artifactId>hbase-client</artifactId> <version>0.98.8-hadoop2</version>
Are there any plans for upgrading to HBase 1.x APIs?
Hola! @bsura has created a ZenHub account for the SalesforceEng organization. ZenHub is the leading team collaboration and project management solution built for GitHub.
To get set up with ZenHub, all you have to do is download the browser extension and log in with your GitHub account. Once you do, you’ll get access to ZenHub’s complete feature-set immediately.
ZenHub adds a series of enhancements directly inside the GitHub UI:
Still curious? See more ZenHub features or read user reviews. This issue was written by your friendly ZenHub bot, posted by request from @bsura.
Experts,
Could you kindly share quick guidance on how to configure argus.properties so that Argus can access its services on the right ports, after I have deployed the WAR as a webapp in Tomcat?
Thanks!
I see a lot of errors in my alert client when I haven't defined any alerts.
argus-alert-client | [ ARGUS | *NULLSESSION* | *NULLUSER* | *NULLTXID* | 2016-12-08 15:06:46.359 | alertclient-0 | WARN ] Exception in alerter: java.lang.IllegalArgumentException: IDs list cannot be null or empty.
I would like to be able to deploy argus as a relatively simple docker image (similar to the opentsdb image: https://hub.docker.com/r/petergrace/opentsdb-docker/). I'm happy to create the images somewhere in this repository or a related one if other thinks it would be a good idea. Any thoughts or suggestions?
We have claimed https://hub.docker.com/u/salesforce. Project should be updated to reflect this docker organization.
Hi, I'm using neither Docker nor Eclipse. Trying to install Argus on an AWS EC2 instance. When I run mvn test, I get "Class [org.postgressql.Driver] not found."
I've tried adding to CLASSPATH, adding jar as a Maven dependency, installing the postgressql-42.1.0.jar into my local Maven repository (said it was successful), etc. No dice.
What are the specific steps necessary to install and use Postgres with Argus? I wish the documentation was much clearer when it comes to prereqs and getting started for non-Docker non-Eclipse shops.
** CID 163940: FindBugs: Performance (FB.SBSC_USE_STRINGBUFFER_CONCATENATION)
/ArgusCore/src/main/java/com/salesforce/dva/argus/service/tsdb/PhoenixTSDBEngine.java: 71 in com.salesforce.dva.argus.service.tsdb.PhoenixTSDBEngine.upsertMetrics(java.sql.Connection, com.salesforce.dva.argus.entity.Metric)()
*** CID 163940: FindBugs: Performance (FB.SBSC_USE_STRINGBUFFER_CONCATENATION)
/ArgusCore/src/main/java/com/salesforce/dva/argus/service/tsdb/PhoenixTSDBEngine.java: 71 in com.salesforce.dva.argus.service.tsdb.PhoenixTSDBEngine.upsertMetrics(java.sql.Connection, com.salesforce.dva.argus.entity.Metric)()
65
66 String viewName = getPhoenixViewName(metric.getScope(), metric.getMetric());
67
68 String tagkeys = "", tagvalues = "";
69 for(Map.Entry<String, String> tagEntry : metric.getTags().entrySet()) {
70 tagkeys += "\"" + tagEntry.getKey() + "\",";
CID 163940: FindBugs: Performance (FB.SBSC_USE_STRINGBUFFER_CONCATENATION) com.salesforce.dva.argus.service.tsdb.PhoenixTSDBEngine.upsertMetrics(Connection, Metric) concatenates strings using + in a loop.
71 tagvalues += "'" + tagEntry.getValue() + "',";
72 }
73
74 if(metric.getDisplayName() != null && !metric.getDisplayName().isEmpty()) {
75 tagkeys += "DISPLAY_NAME,";
76 tagvalues += "'" + metric.getDisplayName() + "',";
** CID 163939: FindBugs: Performance (FB.SBSC_USE_STRINGBUFFER_CONCATENATION)
/ArgusCore/src/main/java/com/salesforce/dva/argus/service/tsdb/PhoenixTSDBEngine.java: 203 in com.salesforce.dva.argus.service.tsdb.PhoenixTSDBEngine.getPhoenixQuery(com.salesforce.dva.argus.service.tsdb.MetricQuery)()
*** CID 163939: FindBugs: Performance (FB.SBSC_USE_STRINGBUFFER_CONCATENATION)
/ArgusCore/src/main/java/com/salesforce/dva/argus/service/tsdb/PhoenixTSDBEngine.java: 203 in com.salesforce.dva.argus.service.tsdb.PhoenixTSDBEngine.getPhoenixQuery(com.salesforce.dva.argus.service.tsdb.MetricQuery)()
197 List tagList = Arrays.asList(tagValue.split("\\|"));
198 String tagValues = tagList.stream()
199 .map((s) -> "'" + s + "'")
200 .collect(Collectors.joining(", "));
201 tagWhereClaue += " AND \"" + tagEntry.getKey() + "\" IN (" + tagValues + ")";
202 } else {
CID 163939: FindBugs: Performance (FB.SBSC_USE_STRINGBUFFER_CONCATENATION) com.salesforce.dva.argus.service.tsdb.PhoenixTSDBEngine.getPhoenixQuery(MetricQuery) concatenates strings using + in a loop.
203 tagWhereClaue += " AND \"" + tagEntry.getKey() + "\" IN ('" + tagValue + "')";
204 }
205 }
206
207 String selectSql = MessageFormat.format("SELECT {0}(val) val, ts epoch_time, display_name, units {1} FROM {2}"
208 + " WHERE ts <= ? AND ts >= ? {3}"
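Both CIDs flag the same pattern: string concatenation with += inside a loop. A minimal sketch of the StringBuilder replacement (names mirror the excerpt above; the surrounding Phoenix upsert/query plumbing is omitted):

```java
import java.util.Map;

public class TagBuilder {
    // Accumulates double-quoted tag keys with StringBuilder instead of +=,
    // which is what FB.SBSC_USE_STRINGBUFFER_CONCATENATION asks for.
    static String quotedTagKeys(Map<String, String> tags) {
        StringBuilder tagkeys = new StringBuilder();
        for (Map.Entry<String, String> tagEntry : tags.entrySet()) {
            tagkeys.append('"').append(tagEntry.getKey()).append("\",");
        }
        return tagkeys.toString();
    }
}
```

The same change applies to tagvalues in upsertMetrics and to tagWhereClaue in getPhoenixQuery.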
** CID 163932: Resource leaks (RESOURCE_LEAK)
/ArgusCore/src/main/java/com/salesforce/dva/argus/service/tsdb/PhoenixTSDBEngine.java: 161 in com.salesforce.dva.argus.service.tsdb.PhoenixTSDBEngine.selectMetrics(java.sql.Connection, com.salesforce.dva.argus.service.tsdb.MetricQuery)()
*** CID 163932: Resource leaks (RESOURCE_LEAK)
/ArgusCore/src/main/java/com/salesforce/dva/argus/service/tsdb/PhoenixTSDBEngine.java: 161 in com.salesforce.dva.argus.service.tsdb.PhoenixTSDBEngine.selectMetrics(java.sql.Connection, com.salesforce.dva.argus.service.tsdb.MetricQuery)()
155 metric.setDatapoints(datapoints);
156 metric.setDisplayName(displayName);
157 metric.setUnits(units);
158 metrics.put(identifier, metric);
159 }
160 }
CID 163932: Resource leaks (RESOURCE_LEAK) Variable "preparedStmt" going out of scope leaks the resource it refers to.
161 } catch(SQLException sqle) {
162 _logger.warn("Failed to read data from Phoenix.", sqle);
163 }
164
165 return new ArrayList<>(metrics.values());
166 }
** CID 163929: (RESOURCE_LEAK)
/ArgusCore/src/main/java/com/salesforce/dva/argus/service/tsdb/PhoenixTSDBService.java: 61 in com.salesforce.dva.argus.service.tsdb.PhoenixTSDBService.<init>(com.salesforce.dva.argus.system.SystemConfiguration, com.salesforce.dva.argus.service.MonitorService)()
/ArgusCore/src/main/java/com/salesforce/dva/argus/service/tsdb/PhoenixTSDBService.java: 67 in com.salesforce.dva.argus.service.tsdb.PhoenixTSDBService.<init>(com.salesforce.dva.argus.system.SystemConfiguration, com.salesforce.dva.argus.service.MonitorService)()
*** CID 163929: (RESOURCE_LEAK)
/ArgusCore/src/main/java/com/salesforce/dva/argus/service/tsdb/PhoenixTSDBService.java: 61 in com.salesforce.dva.argus.service.tsdb.PhoenixTSDBService.<init>(com.salesforce.dva.argus.system.SystemConfiguration, com.salesforce.dva.argus.service.MonitorService)()
55 _connection = DriverManager.getConnection(_phoenixJDBCUrl, props);
56 } catch (SQLException e) {
57 throw new SystemException("Failed to create connection to phoenix using jdbc url: " + _phoenixJDBCUrl, e);
58 }
59
60 try {
CID 163929: (RESOURCE_LEAK) Failing to save or close resource created by "_connection.createStatement()" leaks it.
61 _connection.createStatement().execute("CREATE SEQUENCE IF NOT EXISTS METRIC_ID_SEQ");
62 } catch (SQLException e) {
63 throw new SystemException("Failed to create sequence : " + _phoenixJDBCUrl, e);
64 }
65
66 try {
/ArgusCore/src/main/java/com/salesforce/dva/argus/service/tsdb/PhoenixTSDBService.java: 67 in com.salesforce.dva.argus.service.tsdb.PhoenixTSDBService.<init>(com.salesforce.dva.argus.system.SystemConfiguration, com.salesforce.dva.argus.service.MonitorService)()
61 _connection.createStatement().execute("CREATE SEQUENCE IF NOT EXISTS METRIC_ID_SEQ");
62 } catch (SQLException e) {
63 throw new SystemException("Failed to create sequence : " + _phoenixJDBCUrl, e);
64 }
65
66 try {
CID 163929: (RESOURCE_LEAK) Failing to save or close resource created by "_connection.createStatement()" leaks it.
67 _connection.createStatement().execute("CREATE TABLE ARGUS.METRICS (id INTEGER NOT NULL, ts DATE NOT NULL, val DOUBLE, display_name varchar, units varchar CONSTRAINT PK PRIMARY KEY(id,ts)) APPEND_ONLY_SCHEMA = true, UPDATE_CACHE_FREQUENCY = 900000, AUTO_PARTITION_SEQ=METRIC_ID_SEQ");
68 // TODO change the create table ddl to IF NOT EXISTS PHOENIX-3660 is fixed
69 } catch (TableAlreadyExistsException e) {
70 System.out.println();
71 } catch (SQLException e) {
72 throw new SystemException("Failed to create base table: " + _phoenixJDBCUrl, e);
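A sketch of the fix for the createStatement() leaks: try-with-resources closes the Statement even when execute() throws. DdlRunner is a hypothetical helper; the Connection lifecycle stays with the caller, as in PhoenixTSDBService.

```java
import java.sql.Connection;
import java.sql.SQLException;
import java.sql.Statement;

public class DdlRunner {
    // Executes one DDL statement and guarantees the Statement is closed,
    // addressing the Coverity RESOURCE_LEAK findings on lines 61 and 67.
    static void execute(Connection connection, String ddl) throws SQLException {
        try (Statement stmt = connection.createStatement()) {
            stmt.execute(ddl);
        } // stmt.close() runs here on both the success and failure paths
    }
}
```

The same pattern (or an explicit close in a finally block) would also address the leaked preparedStmt in selectMetrics flagged by CID 163932.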
Discovery service fetch limit estimation: it is currently based on the assumption of minute resolution, multiplied by the time range. This causes many queries to be overestimated and therefore to fail to fetch.
To fix this, I propose first taking a sample to determine the average resolution, and then using that to decide the data volume that will be fetched.
Let me know if you need more details.
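The proposal above can be sketched as follows. The helper names are illustrative; the real service would sample actual datapoints from the query:

```java
public class ResolutionEstimate {
    // Estimates average spacing between consecutive datapoints from a sample.
    static long averageResolutionMillis(long[] sortedTimestamps) {
        if (sortedTimestamps.length < 2) return 60_000L; // fall back to 1 minute
        long span = sortedTimestamps[sortedTimestamps.length - 1] - sortedTimestamps[0];
        return span / (sortedTimestamps.length - 1);
    }

    // Sizes the fetch as range / averageResolution instead of assuming
    // one datapoint per minute over the whole range.
    static long estimatedDatapoints(long rangeMillis, long[] sampleTimestamps) {
        return rangeMillis / Math.max(1, averageResolutionMillis(sampleTimestamps));
    }
}
```

For hourly data over one day this estimates 24 points rather than the 1440 a minute-resolution assumption would produce.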
Hey guys, I am trying to install Argus on my machine. I've installed all the dependencies (I hope I did not miss any) mentioned in https://github.com/SalesforceEng/Argus/wiki/Getting%20Started, but when I run mvn test, I get this error:
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO]
[INFO] Argus ............................................. SUCCESS [0.555s]
[INFO] ArgusCore ......................................... SUCCESS [4:03.576s]
[INFO] ArgusWebServices .................................. FAILURE [0.062s]
[INFO] ArgusClient ....................................... SKIPPED
[INFO] ArgusSDK .......................................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 4:04.305s
[INFO] Finished at: Wed Sep 28 18:06:46 WIB 2016
[INFO] Final Memory: 64M/929M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-dependency-plugin:2.10:unpack-dependencies (unpack-shared-resources) on project argus-webservices: Artifact has not been packaged yet. When used on reactor artifact, unpack should be executed after packaging: see MDEP-98. -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
[ERROR]
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR] mvn -rf :argus-webservices
Any advice? Thanks
If the requested URL is the context root without any trailing '/', then getPathInfo() returns null, resulting in an NPE.
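A minimal defensive fix is to normalize the null before any parsing. The helper below is hypothetical, not existing Argus code; in the servlet it would wrap the request.getPathInfo() call:

```java
public class PathUtil {
    // getPathInfo() returns null when the request URL is exactly the context
    // root with no trailing '/'; treat that as the root path "/" so downstream
    // string parsing never dereferences null.
    static String normalizePathInfo(String pathInfo) {
        return (pathInfo == null) ? "/" : pathInfo;
    }
}
```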
I would like to be able to configure the database connection (at a minimum the user name and password) at runtime. This would allow me to deploy the same built WAR to multiple environments (for example, test and production).
It seems there are a few comments out there about how to do it for Hibernate, or by setting properties directly in Java (http://stackoverflow.com/questions/8324821/properties-reference-for-hibernate-in-persistence-xml and http://stackoverflow.com/questions/3935394/how-to-externalize-properties-from-jpas-persistence-xml). How do you do it at salesforce?
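One standard JPA route (hedged: not necessarily what Argus does today) is the overrides map accepted by Persistence.createEntityManagerFactory(unitName, overrides), which takes precedence over the values baked into persistence.xml. The environment-variable names below are illustrative, not something Argus reads:

```java
import java.util.HashMap;
import java.util.Map;

public class JpaOverrides {
    // Builds a JPA property-override map from environment variables, so the
    // same WAR can point at different databases per deployment.
    static Map<String, String> fromEnv(Map<String, String> env) {
        Map<String, String> overrides = new HashMap<>();
        if (env.containsKey("DB_URL"))
            overrides.put("javax.persistence.jdbc.url", env.get("DB_URL"));
        if (env.containsKey("DB_USER"))
            overrides.put("javax.persistence.jdbc.user", env.get("DB_USER"));
        if (env.containsKey("DB_PASSWORD"))
            overrides.put("javax.persistence.jdbc.password", env.get("DB_PASSWORD"));
        return overrides;
    }
    // Usage (requires a JPA provider on the classpath; unit name is illustrative):
    // EntityManagerFactory emf =
    //     Persistence.createEntityManagerFactory("argus-pu", fromEnv(System.getenv()));
}
```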
In a rare case the UI will crash: the Discovery lookup panel crashes if no result is fetched from the discovery service.
Example:
Chrome/Firefox/Safari
Discovery lookup, scope text box: core.TYO
result:
Error msg: $scope.searchMetrics/<@https://argus-ui.data.sfdc.net/argus/js/controllers/viewMetrics.js:92:21
I noticed the Travis CI button and some of the wiki links are still referring to the old project name (SalesforceEng/Argus) from before the migration (#108).
I couldn't figure out whether the Coverity job also needs to be modified.
Please merge my changes to the Wiki. Contains descriptions of the following transforms:
Wiki: https://github.com/shouvikmani/Argus/wiki/Transforms
Repo: https://github.com/shouvikmani/Argus.wiki.git
When I try to run the unit tests, an error occurs (see the picture above).
I put persistence.xml under the test resources dir,
and argus.properties has config items for JPA,
but every time I see that error. Can you help me solve it?
java.lang.IllegalArgumentException: Unknown Entity bean class: class com.salesforce.dva.argus.entity.PrincipalUser, please verify that this class has been marked with the @entity annotation.
at org.eclipse.persistence.internal.jpa.EntityManagerImpl.find(EntityManagerImpl.java:718)
at org.eclipse.persistence.internal.jpa.EntityManagerImpl.find(EntityManagerImpl.java:599)
How do I install? The documentation given on the wiki is not clear.
The wiki mentions that Argus has been tested with Postgres as a persistent database. I attempted to configure that, and ran into an exception trying to create a JPAEntity:
[ ARGUS | *NULLSESSION* | *NULLUSER* | *NULLTXID* | 2016-12-08 09:00:28.611 | ost-startStop-1 | DEBUG ] Retrieving the administrative user.
[ ARGUS | *NULLSESSION* | *NULLUSER* | *NULLTXID* | 2016-12-08 09:00:28.620 | ost-startStop-1 | DEBUG ] Query for user having id 1 resulted in : null
[ ARGUS | *NULLSESSION* | *NULLUSER* | *NULLTXID* | 2016-12-08 09:00:28.629 | ost-startStop-1 | DEBUG ] Updated user to : PrincipalUser{userName=admin, [email protected], preferences={{}}, privileged=true}
[ ARGUS | *NULLSESSION* | *NULLUSER* | *NULLTXID* | 2016-12-08 09:00:28.664 | ost-startStop-1 | DEBUG ] Created audit object Audit{id=1, createdDate=Thu Dec 08 09:00:28 GMT 2016, message=Updated user : PrincipalUser{userName=admin, [email protected], preferences={{}}, privileged=true}, hostName=argus-web-services, object=1}
[ ARGUS | *NULLSESSION* | *NULLUSER* | *NULLTXID* | 2016-12-08 09:00:28.679 | ost-startStop-1 | ERROR ] SystemMain startup aborted.
com.salesforce.dva.argus.system.SystemException: javax.persistence.PersistenceException: Exception [EclipseLink-4002] (Eclipse Persistence Services - 2.6.2.v20151217-774c696): org.eclipse.persistence.exceptions.DatabaseException
Internal Exception: org.postgresql.util.PSQLException: ERROR: column "deleted" is of type smallint but expression is of type boolean
Hint: You will need to rewrite or cast the expression.
Position: 116
Error Code: 0
Call: INSERT INTO JPAENTITY (ID, CREATEDDATE, DELETED, MODIFIEDDATE, CREATEDBY_ID, MODIFIEDBY_ID, DTYPE) VALUES (?, ?, ?, ?, ?, ?, ?)
bind => [1, 2016-12-08 09:00:28.627, false, 2016-12-08 09:00:28.627, null, null, PrincipalUser]
Query: InsertObjectQuery(PrincipalUser{userName=admin, [email protected], preferences={{}}, privileged=true})
at com.salesforce.dva.argus.service.jpa.DefaultUserService.findAdminUser(DefaultUserService.java:148) ~[argus-core-2.2.1-SNAPSHOT.jar:na]
at com.google.inject.persist.jpa.JpaLocalTxnInterceptor.invoke(JpaLocalTxnInterceptor.java:70) ~[guice-persist-4.0.jar:na]
at com.salesforce.dva.argus.system.SystemMain.doStart(SystemMain.java:114) ~[argus-core-2.2.1-SNAPSHOT.jar:na]
at com.salesforce.dva.argus.system.SystemService.start(SystemService.java:104) [argus-core-2.2.1-SNAPSHOT.jar:na]
at com.salesforce.dva.argus.ws.listeners.ArgusWebServletListener.contextInitialized(ArgusWebServletListener.java:92) [classes/:na]
at org.apache.catalina.core.StandardContext.listenerStart(StandardContext.java:5118) [catalina.jar:7.0.73]
at org.apache.catalina.core.StandardContext.startInternal(StandardContext.java:5634) [catalina.jar:7.0.73]
at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:145) [catalina.jar:7.0.73]
at org.apache.catalina.core.ContainerBase.addChildInternal(ContainerBase.java:899) [catalina.jar:7.0.73]
at org.apache.catalina.core.ContainerBase.addChild(ContainerBase.java:875) [catalina.jar:7.0.73]
at org.apache.catalina.core.StandardHost.addChild(StandardHost.java:652) [catalina.jar:7.0.73]
at org.apache.catalina.startup.HostConfig.deployWAR(HostConfig.java:1092) [catalina.jar:7.0.73]
at org.apache.catalina.startup.HostConfig$DeployWar.run(HostConfig.java:1984) [catalina.jar:7.0.73]
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) [na:1.8.0_111-internal]
at java.util.concurrent.FutureTask.run(FutureTask.java:266) [na:1.8.0_111-internal]
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) [na:1.8.0_111-internal]
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) [na:1.8.0_111-internal]
at java.lang.Thread.run(Thread.java:745) [na:1.8.0_111-internal]
Caused by: javax.persistence.PersistenceException: Exception [EclipseLink-4002] (Eclipse Persistence Services - 2.6.2.v20151217-774c696): org.eclipse.persistence.exceptions.DatabaseException
Internal Exception: org.postgresql.util.PSQLException: ERROR: column "deleted" is of type smallint but expression is of type boolean
Hint: You will need to rewrite or cast the expression.
Position: 116
Error Code: 0
Call: INSERT INTO JPAENTITY (ID, CREATEDDATE, DELETED, MODIFIEDDATE, CREATEDBY_ID, MODIFIEDBY_ID, DTYPE) VALUES (?, ?, ?, ?, ?, ?, ?)
bind => [1, 2016-12-08 09:00:28.627, false, 2016-12-08 09:00:28.627, null, null, PrincipalUser]
Query: InsertObjectQuery(PrincipalUser{userName=admin, [email protected], preferences={{}}, privileged=true})
at org.eclipse.persistence.internal.jpa.EntityManagerImpl.flush(EntityManagerImpl.java:879) ~[eclipselink-2.6.2.jar:2.6.2.v20151217-774c696]
at com.salesforce.dva.argus.service.jpa.DefaultUserService.updateUser(DefaultUserService.java:128) ~[argus-core-2.2.1-SNAPSHOT.jar:na]
at com.google.inject.persist.jpa.JpaLocalTxnInterceptor.invoke(JpaLocalTxnInterceptor.java:62) ~[guice-persist-4.0.jar:na]
at com.salesforce.dva.argus.service.jpa.DefaultUserService.findAdminUser(DefaultUserService.java:145) ~[argus-core-2.2.1-SNAPSHOT.jar:na]
... 17 common frames omitted
Caused by: org.eclipse.persistence.exceptions.DatabaseException:
Internal Exception: org.postgresql.util.PSQLException: ERROR: column "deleted" is of type smallint but expression is of type boolean
Hint: You will need to rewrite or cast the expression.
Position: 116
Error Code: 0
Call: INSERT INTO JPAENTITY (ID, CREATEDDATE, DELETED, MODIFIEDDATE, CREATEDBY_ID, MODIFIEDBY_ID, DTYPE) VALUES (?, ?, ?, ?, ?, ?, ?)
bind => [1, 2016-12-08 09:00:28.627, false, 2016-12-08 09:00:28.627, null, null, PrincipalUser]
Query: InsertObjectQuery(PrincipalUser{userName=admin, [email protected], preferences={{}}, privileged=true})
at org.eclipse.persistence.exceptions.DatabaseException.sqlException(DatabaseException.java:340) ~[eclipselink-2.6.2.jar:2.6.2.v20151217-774c696]
at org.eclipse.persistence.internal.databaseaccess.DatabaseAccessor.basicExecuteCall(DatabaseAccessor.java:684) ~[eclipselink-2.6.2.jar:2.6.2.v20151217-774c696]
at org.eclipse.persistence.internal.databaseaccess.DatabaseAccessor.executeCall(DatabaseAccessor.java:560) ~[eclipselink-2.6.2.jar:2.6.2.v20151217-774c696]
at org.eclipse.persistence.internal.sessions.AbstractSession.basicExecuteCall(AbstractSession.java:2055) ~[eclipselink-2.6.2.jar:2.6.2.v20151217-774c696]
at org.eclipse.persistence.sessions.server.ClientSession.executeCall(ClientSession.java:306) ~[eclipselink-2.6.2.jar:2.6.2.v20151217-774c696]
at org.eclipse.persistence.internal.queries.DatasourceCallQueryMechanism.executeCall(DatasourceCallQueryMechanism.java:242) ~[eclipselink-2.6.2.jar:2.6.2.v20151217-774c696]
at org.eclipse.persistence.internal.queries.DatasourceCallQueryMechanism.insertObject(DatasourceCallQueryMechanism.java:363) ~[eclipselink-2.6.2.jar:2.6.2.v20151217-774c696]
at org.eclipse.persistence.internal.queries.StatementQueryMechanism.insertObject(StatementQueryMechanism.java:165) ~[eclipselink-2.6.2.jar:2.6.2.v20151217-774c696]
at org.eclipse.persistence.internal.queries.StatementQueryMechanism.insertObject(StatementQueryMechanism.java:180) ~[eclipselink-2.6.2.jar:2.6.2.v20151217-774c696]
at org.eclipse.persistence.internal.queries.DatabaseQueryMechanism.insertObjectForWrite(DatabaseQueryMechanism.java:489) ~[eclipselink-2.6.2.jar:2.6.2.v20151217-774c696]
at org.eclipse.persistence.queries.InsertObjectQuery.executeCommit(InsertObjectQuery.java:80) ~[eclipselink-2.6.2.jar:2.6.2.v20151217-774c696]
at org.eclipse.persistence.queries.InsertObjectQuery.executeCommitWithChangeSet(InsertObjectQuery.java:90) ~[eclipselink-2.6.2.jar:2.6.2.v20151217-774c696]
at org.eclipse.persistence.internal.queries.DatabaseQueryMechanism.executeWriteWithChangeSet(DatabaseQueryMechanism.java:301) ~[eclipselink-2.6.2.jar:2.6.2.v20151217-774c696]
at org.eclipse.persistence.queries.WriteObjectQuery.executeDatabaseQuery(WriteObjectQuery.java:58) ~[eclipselink-2.6.2.jar:2.6.2.v20151217-774c696]
at org.eclipse.persistence.queries.DatabaseQuery.execute(DatabaseQuery.java:904) ~[eclipselink-2.6.2.jar:2.6.2.v20151217-774c696]
at org.eclipse.persistence.queries.DatabaseQuery.executeInUnitOfWork(DatabaseQuery.java:803) ~[eclipselink-2.6.2.jar:2.6.2.v20151217-774c696]
at org.eclipse.persistence.queries.ObjectLevelModifyQuery.executeInUnitOfWorkObjectLevelModifyQuery(ObjectLevelModifyQuery.java:108) ~[eclipselink-2.6.2.jar:2.6.2.v20151217-774c696]
at org.eclipse.persistence.queries.ObjectLevelModifyQuery.executeInUnitOfWork(ObjectLevelModifyQuery.java:85) ~[eclipselink-2.6.2.jar:2.6.2.v20151217-774c696]
at org.eclipse.persistence.internal.sessions.UnitOfWorkImpl.internalExecuteQuery(UnitOfWorkImpl.java:2896) ~[eclipselink-2.6.2.jar:2.6.2.v20151217-774c696]
at org.eclipse.persistence.internal.sessions.AbstractSession.executeQuery(AbstractSession.java:1857) ~[eclipselink-2.6.2.jar:2.6.2.v20151217-774c696]
at org.eclipse.persistence.internal.sessions.AbstractSession.executeQuery(AbstractSession.java:1839) ~[eclipselink-2.6.2.jar:2.6.2.v20151217-774c696]
at org.eclipse.persistence.internal.sessions.AbstractSession.executeQuery(AbstractSession.java:1790) ~[eclipselink-2.6.2.jar:2.6.2.v20151217-774c696]
at org.eclipse.persistence.internal.sessions.CommitManager.commitNewObjectsForClassWithChangeSet(CommitManager.java:227) ~[eclipselink-2.6.2.jar:2.6.2.v20151217-774c696]
at org.eclipse.persistence.internal.sessions.CommitManager.commitAllObjectsForClassWithChangeSet(CommitManager.java:194) ~[eclipselink-2.6.2.jar:2.6.2.v20151217-774c696]
at org.eclipse.persistence.internal.sessions.CommitManager.commitAllObjectsWithChangeSet(CommitManager.java:139) ~[eclipselink-2.6.2.jar:2.6.2.v20151217-774c696]
at org.eclipse.persistence.internal.sessions.AbstractSession.writeAllObjectsWithChangeSet(AbstractSession.java:4263) ~[eclipselink-2.6.2.jar:2.6.2.v20151217-774c696]
at org.eclipse.persistence.internal.sessions.UnitOfWorkImpl.commitToDatabase(UnitOfWorkImpl.java:1441) ~[eclipselink-2.6.2.jar:2.6.2.v20151217-774c696]
at org.eclipse.persistence.internal.sessions.UnitOfWorkImpl.commitToDatabaseWithPreBuiltChangeSet(UnitOfWorkImpl.java:1587) ~[eclipselink-2.6.2.jar:2.6.2.v20151217-774c696]
at org.eclipse.persistence.internal.sessions.RepeatableWriteUnitOfWork.writeChanges(RepeatableWriteUnitOfWork.java:455) ~[eclipselink-2.6.2.jar:2.6.2.v20151217-774c696]
at org.eclipse.persistence.internal.jpa.EntityManagerImpl.flush(EntityManagerImpl.java:874) ~[eclipselink-2.6.2.jar:2.6.2.v20151217-774c696]
... 20 common frames omitted
Caused by: org.postgresql.util.PSQLException: ERROR: column "deleted" is of type smallint but expression is of type boolean
Hint: You will need to rewrite or cast the expression.
Position: 116
at org.postgresql.core.v3.QueryExecutorImpl.receiveErrorResponse(QueryExecutorImpl.java:2103) ~[postgresql-9.1-901-1.jdbc4.jar:na]
at org.postgresql.core.v3.QueryExecutorImpl.processResults(QueryExecutorImpl.java:1836) ~[postgresql-9.1-901-1.jdbc4.jar:na]
at org.postgresql.core.v3.QueryExecutorImpl.execute(QueryExecutorImpl.java:257) ~[postgresql-9.1-901-1.jdbc4.jar:na]
at org.postgresql.jdbc3.AbstractJdbc3Statement.getParameterMetaData(AbstractJdbc3Statement.java:414) ~[postgresql-9.1-901-1.jdbc4.jar:na]
at org.eclipse.persistence.platform.database.DerbyPlatform.setNullFromDatabaseField(DerbyPlatform.java:302) ~[eclipselink-2.6.2.jar:2.6.2.v20151217-774c696]
at org.eclipse.persistence.internal.databaseaccess.DatabasePlatform.setParameterValueInDatabaseCall(DatabasePlatform.java:2477) ~[eclipselink-2.6.2.jar:2.6.2.v20151217-774c696]
at org.eclipse.persistence.internal.databaseaccess.DatabaseCall.prepareStatement(DatabaseCall.java:797) ~[eclipselink-2.6.2.jar:2.6.2.v20151217-774c696]
at org.eclipse.persistence.internal.databaseaccess.DatabaseAccessor.basicExecuteCall(DatabaseAccessor.java:621) ~[eclipselink-2.6.2.jar:2.6.2.v20151217-774c696]
... 48 common frames omitted
My argus-build.properties looks like the following during the build process
# Default settings for unit and integration tests.
build.property.persistence.unit=<provider>org.eclipse.persistence.jpa.PersistenceProvider</provider>\n\
<exclude-unlisted-classes>false</exclude-unlisted-classes>\n\
<properties>\n\
<property name="javax.persistence.schema-generation.database.action" value="drop-and-create-tables"/>\n\
<property name="javax.persistence.jdbc.driver" value="org.postgresql.Driver"/>\n\
<property name="javax.persistence.jdbc.url" value="jdbc:postgresql://postgres:5432/argus_user"/>\n\
<property name="javax.persistence.jdbc.user" value="argus_user"/>\n\
<property name="javax.persistence.jdbc.password" value="password"/>\n\
<property name="eclipselink.ddl-generation" value="drop-and-create-tables"/>\n\
<property name="eclipselink.logging.level" value="SEVERE"/>\n\
<property name="eclipselink.logging.parameters" value="true"/>\n\
<property name="eclipselink.target-database" value="Auto"/>\n\
<property name="eclipselink.canonicalmodel.subpackage" value="unit"/>\n\
</properties>
build.property.secure.cookies=false
[email protected]
system.property.log.level=ERROR
system.property.mail.enabled=false
service.property.mq.connection.count=2
service.property.mq.endpoint=vm\://localhost?broker.persistent\=false
service.property.auth.ldap.authtype=simple
service.property.auth.ldap.endpoint=ldaps://ldaps.mycompany.com:636
service.property.auth.ldap.searchbase=OU=active,OU=users,DC=mycompany,DC=com:OU\=active,OU\=robot,DC\=mycompany,DC\=com
service.property.auth.ldap.searchdn=CN=argus_service,OU=active,OU=users,DC=mycompany,DC=com
service.property.auth.ldap.searchpwd=argus_service_password
service.property.auth.ldap.usernamefield=sAMAccountName
service.property.mail.alerturl.template=https\://localhost\:8443/argus/\#/alerts/$alertid$
service.property.mail.metricurl.template=https\://localhost\:8443/argus/\#/viewmetrics?expression\=$expression$
service.property.mail.smtp.auth=false
service.property.mail.smtp.host=smtprelay.mycompany.com
service.property.mail.smtp.starttls.enable=false
service.property.tsdb.connection.count=2
service.property.tsdb.endpoint.read=http://tsdbread.mycompany.com:4466
service.property.tsdb.endpoint.timeout=10000
service.property.tsdb.endpoint.write=http://tsdbwrite.mycompany.com:4477
service.property.cache.redis.cluster=redis0.mycompany.com:6379,redis1.mycompany.com:6389
asynchbase.property.hbase.zookeeper.connect=host1,host2:2181
and the following during runtime
# Default settings for unit and integration tests.
build.property.persistence.unit=<provider>org.eclipse.persistence.jpa.PersistenceProvider</provider>\n\
<exclude-unlisted-classes>false</exclude-unlisted-classes>\n\
<properties>\n\
<property name="javax.persistence.schema-generation.database.action" value="drop-and-create-tables"/>\n\
<property name="javax.persistence.jdbc.driver" value="org.postgresql.Driver"/>\n\
<property name="javax.persistence.jdbc.url" value="jdbc:postgresql://postgres:5432/argus_user"/>\n\
<property name="javax.persistence.jdbc.user" value="argus_user"/>\n\
<property name="javax.persistence.jdbc.password" value="password"/>\n\
<property name="eclipselink.ddl-generation" value="drop-and-create-tables"/>\n\
<property name="eclipselink.logging.level" value="SEVERE"/>\n\
<property name="eclipselink.logging.parameters" value="true"/>\n\
<property name="eclipselink.target-database" value="DERBY"/>\n\
<property name="eclipselink.canonicalmodel.subpackage" value="unit"/>\n\
</properties>
build.property.secure.cookies=false
[email protected]
system.property.log.level=DEBUG
system.property.mail.enabled=false
# skip ldap (any user can log in with any password)
service.binding.auth=com.salesforce.dva.argus.service.auth.NoAuthService
service.property.mail.alerturl.template=https\://localhost\:8443/argus/\#/alerts/$alertid$
service.property.mail.metricurl.template=https\://localhost\:8443/argus/\#/viewmetrics?expression\=$expression$
service.property.mail.smtp.auth=false
service.property.mail.smtp.host=smtprelay.mycompany.com
service.property.mail.smtp.starttls.enable=false
service.property.tsdb.connection.count=2
service.property.tsdb.endpoint.read=http://opentsdb:4242
service.property.tsdb.endpoint.timeout=10000
service.property.tsdb.endpoint.write=http://opentsdb:4242
service.property.cache.redis.cluster=redis:6379
# kafka
service.property.mq.kafka.brokers=kafka:9092
service.property.mq.zookeeper.connect=kafka:2181
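One thing stands out when comparing the two configurations with the stack trace above: the trace shows EclipseLink's DerbyPlatform (see the DerbyPlatform.setNullFromDatabaseField frame) issuing SQL against a PostgreSQL database, while the persistence unit sets eclipselink.target-database to Auto at build time and DERBY at runtime. As an unverified guess, aligning the EclipseLink platform with the PostgreSQL driver may resolve the smallint-vs-boolean mismatch:

```
# Unverified suggestion: make the EclipseLink platform match the
# PostgreSQL JDBC driver instead of DERBY/Auto, so boolean columns are
# bound with PostgreSQL's type mapping.
<property name="eclipselink.target-database" value="PostgreSQL"/>\n\
```

This replaces the eclipselink.target-database line inside the build.property.persistence.unit value shown above.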
This page could use some more info and examples:
https://github.com/salesforce/Argus/wiki/Users-Resource#create-users
We are currently using Argus with OpenTSDB 2.3. I've seen that Argus validates expressions with the MetricsReader and only allows aggregators up to OpenTSDB 2.0, so aggregators like "count", "last", "first", "none", and all of the percentile variants are not available in expressions. Is there a reason for that filtering, or can we simply extend the aggregator list and use all of the OpenTSDB functions available in 2.3?
Best regards,
Stefan
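I don't know the actual MetricsReader internals, so every name in the sketch below is hypothetical, but the filtering described above presumably boils down to a fixed allow-list of aggregator tokens; extending it for 2.3 might look like this:

```java
import java.util.Arrays;
import java.util.HashSet;
import java.util.Set;

// Hypothetical sketch -- class and field names are illustrative, not the
// real MetricsReader API. The idea: expression validation checks the
// aggregator token against a fixed set, so supporting OpenTSDB 2.3 would
// mean growing that set (ideally behind a version flag).
public class AggregatorValidator {

    // Aggregators accepted for OpenTSDB <= 2.0 expressions.
    private static final Set<String> V2_0 = new HashSet<>(
        Arrays.asList("avg", "sum", "min", "max", "dev", "zimsum", "mimmin", "mimmax"));

    // Additional aggregators available as of OpenTSDB 2.3.
    private static final Set<String> V2_3_EXTRAS = new HashSet<>(
        Arrays.asList("count", "first", "last", "none", "p50", "p90", "p95", "p99"));

    public static boolean isValid(String aggregator, boolean allow23) {
        String token = aggregator.toLowerCase();
        return V2_0.contains(token) || (allow23 && V2_3_EXTRAS.contains(token));
    }
}
```

The version flag keeps the stricter 2.0 behavior as the default for deployments still running an older TSDB.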
Addressing some security vulnerabilities in dependency versions and moving version declarations to the parent POM's dependencyManagement section.
When I run the mvn test command, I get a bind failure but can't tell which port is already in use:
[ ARGUS | *NULLSESSION* | *NULLUSER* | *NULLTXID* | 2017-02-15 11:40:49.501 | Thread-13 | ERROR ] From testing server (random state: false) for instance: InstanceSpec{dataDirectory=/var/folders/c5/cf47njj11kv8_6z7h4z22rg80000gn/T/1487130049499-0, port=2185, electionPort=50039, quorumPort=50040, deleteDataDirectoryOnClose=true, serverId=9, tickTime=-1, maxClientCnxns=-1} org.apache.curator.test.InstanceSpec@889
java.net.BindException: Address already in use
Thanks.
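To find out which of the ports the embedded test server wants (2185, plus the election and quorum ports 50039/50040 shown in the InstanceSpec above) is already taken, a quick probe like this can help. It is a generic sketch, not Argus code:

```java
import java.io.IOException;
import java.net.ServerSocket;

// Generic helper (not part of Argus): try to bind each candidate port
// and report the ones already held by another process.
public class PortProbe {

    public static boolean isFree(int port) {
        try (ServerSocket socket = new ServerSocket(port)) {
            socket.setReuseAddress(true);
            return true;                 // bind succeeded, port is free
        } catch (IOException e) {
            return false;                // BindException etc.: port is in use
        }
    }

    public static void main(String[] args) {
        // Ports taken from the failing InstanceSpec in the log above.
        for (int port : new int[] {2185, 50039, 50040}) {
            System.out.println(port + " free: " + isFree(port));
        }
    }
}
```

Once the busy port is identified, a tool like lsof or netstat can show which process holds it.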
I'm having trouble with CORS errors when I deploy argus locally.
I have the following setup:
When I attempt to login from http://localhost:8082/app/#/login, I see the OPTIONS POST request to http://localhost:8081/argus/auth/login, and the following response headers
HTTP/1.1 200 OK
Server: Apache-Coyote/1.1
Set-Cookie: JSESSIONID=EDD932C66AF1271426DF42D846450BAC; Path=/argus; HttpOnly
Allow: POST,OPTIONS
Last-modified: Mon, 05 Dec 2016 12:09:13 UTC
Content-Type: application/vnd.sun.wadl+xml
Content-Length: 843
Date: Mon, 05 Dec 2016 12:09:13 GMT
From the javascript console, I see a CORS error
XMLHttpRequest cannot load http://localhost:8081/argus/auth/login. Response to preflight request doesn't pass access control check: No 'Access-Control-Allow-Origin' header is present on the requested resource. Origin 'http://localhost:8082' is therefore not allowed access.
From reading stack overflow, it seems that since the web services are on a different port from the node app, javascript detects it as a different origin.
So, the question is: how do you normally deploy locally? I could add a CORS filter to the war to set Access-Control-Allow-Origin: *, but that should probably not be set when deploying in production.
This setup should be easy to reproduce with the docker-compose file at https://github.com/rmelick/Argus/tree/docker-images/ArgusDocker/simple.
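For what it's worth, here is a minimal sketch of what such a filter would do. It uses the JDK's built-in HttpServer purely to illustrate the response headers; in Argus itself this would be a servlet Filter in the war, and the allowed origin should be pinned to the UI host rather than * in production:

```java
import com.sun.net.httpserver.HttpServer;
import java.io.OutputStream;
import java.net.InetSocketAddress;

// Illustrative only -- a JDK HttpServer standing in for the Argus web
// services, showing the CORS headers a filter would need to add so the
// browser accepts cross-port requests from the node app.
public class CorsSketch {

    public static HttpServer start(int port, String allowedOrigin) throws Exception {
        HttpServer server = HttpServer.create(new InetSocketAddress(port), 0);
        server.createContext("/argus/auth/login", exchange -> {
            // Headers needed by both the OPTIONS preflight and the actual POST.
            exchange.getResponseHeaders().add("Access-Control-Allow-Origin", allowedOrigin);
            exchange.getResponseHeaders().add("Access-Control-Allow-Methods", "POST, OPTIONS");
            exchange.getResponseHeaders().add("Access-Control-Allow-Headers", "Content-Type");
            byte[] body = "ok".getBytes();
            exchange.sendResponseHeaders(200, body.length);
            try (OutputStream out = exchange.getResponseBody()) {
                out.write(body);
            }
        });
        server.start();
        return server;
    }
}
```

With the origin pinned to http://localhost:8082 (the node app), the preflight in the console error above would pass the access-control check.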
Methods like getFailCount() and getSuccessCount() return String.
Why are they not numeric?
I see the following failure when I try to build on Ubuntu 15.10 with maven 3.3
[INFO] Compiling 242 source files to /home/rmelick/src/other/rmelick-Argus/ArgusCore/target/classes
An exception has occurred in the compiler (1.8.0_101). Please file a bug against the Java compiler via the Java bug reporting page (http://bugreport.java.com) after checking the Bug Database (http://bugs.java.com) for duplicates. Include your program and the following diagnostic in your report. Thank you.
java.lang.IllegalStateException: endPosTable already set
at com.sun.tools.javac.util.DiagnosticSource.setEndPosTable(DiagnosticSource.java:136)
at com.sun.tools.javac.util.Log.setEndPosTable(Log.java:350)
at com.sun.tools.javac.main.JavaCompiler.parse(JavaCompiler.java:667)
at com.sun.tools.javac.main.JavaCompiler.parseFiles(JavaCompiler.java:950)
at com.sun.tools.javac.processing.JavacProcessingEnvironment$Round.<init>(JavacProcessingEnvironment.java:892)
at com.sun.tools.javac.processing.JavacProcessingEnvironment$Round.next(JavacProcessingEnvironment.java:921)
at com.sun.tools.javac.processing.JavacProcessingEnvironment.doProcessing(JavacProcessingEnvironment.java:1187)
at com.sun.tools.javac.main.JavaCompiler.processAnnotations(JavaCompiler.java:1170)
at com.sun.tools.javac.main.JavaCompiler.compile(JavaCompiler.java:856)
at com.sun.tools.javac.main.Main.compile(Main.java:523)
at com.sun.tools.javac.api.JavacTaskImpl.doCall(JavacTaskImpl.java:129)
at com.sun.tools.javac.api.JavacTaskImpl.call(JavacTaskImpl.java:138)
at org.codehaus.plexus.compiler.javac.JavaxToolsCompiler.compileInProcess(JavaxToolsCompiler.java:125)
at org.codehaus.plexus.compiler.javac.JavacCompiler.performCompile(JavacCompiler.java:169)
at org.apache.maven.plugin.compiler.AbstractCompilerMojo.execute(AbstractCompilerMojo.java:823)
at org.apache.maven.plugin.compiler.CompilerMojo.execute(CompilerMojo.java:129)
at org.apache.maven.plugin.DefaultBuildPluginManager.executeMojo(DefaultBuildPluginManager.java:134)
at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:208)
at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:153)
at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:145)
at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject(LifecycleModuleBuilder.java:116)
at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject(LifecycleModuleBuilder.java:80)
at org.apache.maven.lifecycle.internal.builder.singlethreaded.SingleThreadedBuilder.build(SingleThreadedBuilder.java:51)
at org.apache.maven.lifecycle.internal.LifecycleStarter.execute(LifecycleStarter.java:128)
at org.apache.maven.DefaultMaven.doExecute(DefaultMaven.java:307)
at org.apache.maven.DefaultMaven.doExecute(DefaultMaven.java:193)
at org.apache.maven.DefaultMaven.execute(DefaultMaven.java:106)
at org.apache.maven.cli.MavenCli.execute(MavenCli.java:862)
at org.apache.maven.cli.MavenCli.doMain(MavenCli.java:286)
at org.apache.maven.cli.MavenCli.main(MavenCli.java:197)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.codehaus.plexus.classworlds.launcher.Launcher.launchEnhanced(Launcher.java:289)
at org.codehaus.plexus.classworlds.launcher.Launcher.launch(Launcher.java:229)
at org.codehaus.plexus.classworlds.launcher.Launcher.mainWithExitCode(Launcher.java:415)
at org.codehaus.plexus.classworlds.launcher.Launcher.main(Launcher.java:356)
[INFO] -------------------------------------------------------------
[ERROR] COMPILATION ERROR :
[INFO] -------------------------------------------------------------
[ERROR] An unknown compilation problem occurred
[INFO] 1 error
[INFO] -------------------------------------------------------------
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO]
[INFO] Argus .............................................. SUCCESS [ 1.107 s]
[INFO] ArgusCore .......................................... FAILURE [ 2.117 s]
[INFO] ArgusWebServices ................................... SKIPPED
[INFO] ArgusClient ........................................ SKIPPED
[INFO] ArgusSDK ........................................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 3.685 s
[INFO] Finished at: 2016-12-05T10:56:50+01:00
[INFO] Final Memory: 45M/502M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-compiler-plugin:3.2:compile (default-compile) on project argus-core: Compilation failure
[ERROR] An unknown compilation problem occurred
[ERROR] -> [Help 1]
rmelick@rmelick-ld:~/src/other/rmelick-Argus$ mvn -v
Apache Maven 3.3.3
Maven home: /usr/share/maven
Java version: 1.8.0_101, vendor: Oracle Corporation
Java home: /usr/lib/jvm/java-8-oracle/jre
Default locale: en_US, platform encoding: UTF-8
OS name: "linux", version: "4.2.0-42-generic", arch: "amd64", family: "unix"
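This "endPosTable already set" failure is a known javac quirk triggered when annotation processing re-parses already-parsed sources (Argus runs EclipseLink's canonical model generation during compile). Two workarounds that have worked for others, offered here as unverified suggestions: run mvn clean before compiling, or disable incremental compilation in the compiler plugin:

```
<!-- Unverified workaround: stop maven-compiler-plugin from re-feeding
     generated sources to javac, which can trigger
     "endPosTable already set" during annotation processing. -->
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-compiler-plugin</artifactId>
  <configuration>
    <useIncrementalCompilation>false</useIncrementalCompilation>
  </configuration>
</plugin>
```

Upgrading maven-compiler-plugin beyond 3.2 (the version shown in the error above) may also avoid the issue.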
The Getting Started page has a number of bad links (links that don't go anywhere), and a few other issues.
Salesforce is in the process of migrating all repositories from github.com/SalesforceEng to github.com/salesforce. As a part of this migration, this repository is slated to be moved as well. To request an exception or for any concerns contact @SalesforceEng/osscore or email [email protected]. There is no set date for the migration yet. For details on how GitHub repository transfers work, see here. Posted at 2016-08-23 22:45:38 PST.