openscoring / openscoring-python

Python client library for the Openscoring REST web service

License: GNU Affero General Public License v3.0

openscoring-python's Introduction

Openscoring

REST web service for scoring PMML models.

Features

  • Full support for PMML specification versions 3.0 through 4.4. The evaluation is handled by the JPMML-Evaluator library.
  • Simple and powerful REST API:
    • Model deployment and undeployment.
    • Model evaluation in single prediction, batch prediction and CSV prediction modes.
  • High performance and high throughput:
    • Sub-millisecond response times.
    • Request and response compression using gzip and deflate encodings.
    • Thread safe.
  • Open, extensible architecture for easy integration with proprietary systems and services:
    • User authentication and authorization.

Prerequisites

  • Java 11 or newer.

Installation

Binary release

Openscoring client and server uber-JAR files are distributed via the GitHub releases page, and the Openscoring webapp WAR file is distributed via the Maven Central repository.

This README file corresponds to the latest source code snapshot. In order to follow its instructions as closely as possible, it is recommended to download the latest binary release.

The current version is 2.1.1 (25 September, 2022).

Source code snapshot

Enter the project root directory and build using Apache Maven:

mvn clean install

The build produces two uber-JAR files and a WAR file:

  • openscoring-client/target/openscoring-client-executable-2.1-SNAPSHOT.jar
  • openscoring-server/target/openscoring-server-executable-2.1-SNAPSHOT.jar
  • openscoring-webapp/target/openscoring-webapp-2.1-SNAPSHOT.war

Usage

The example PMML file DecisionTreeIris.pmml along with example JSON and CSV files can be found in the openscoring-service/src/etc directory.

Server side

Launch the executable uber-JAR file:

java -jar openscoring-server-executable-${version}.jar

By default, the REST web service is started at http://localhost:8080/openscoring. The main class org.openscoring.server.Main accepts a number of configuration options for URI customization and other purposes. Please specify --help for more information.

java -jar openscoring-server-executable-${version}.jar --help
Advanced configuration

Copy the sample Typesafe Config configuration file openscoring-server/application.conf.sample to a new file application.conf, and customize its contents as needed. Use the config.file system property to make the JVM pick it up:

java -Dconfig.file=application.conf -jar openscoring-server-executable-${version}.jar

The local configuration overrides the default configuration that is defined in the reference REST web service configuration file openscoring-service/src/main/reference.conf. For example, the following local configuration would selectively override the list-valued networkSecurityContextFilter.adminAddresses property (treats any local or remote IP address as an admin IP address):

networkSecurityContextFilter {
	adminAddresses = ["*"]
}
Logging

Copy the sample Java Logging API configuration file openscoring-server/logging.properties.sample to a new file logging.properties, and customize its contents as needed. Use the java.util.logging.config.file system property to make the JVM pick it up:

java -Djava.util.logging.config.file=logging.properties -jar target/openscoring-server-executable-${version}.jar

Client side

Java

Replay the life cycle of a sample DecisionTreeIris model (see the "REST API" section below) by launching the following Java application classes from the uber-JAR file:

java -cp openscoring-client-executable-${version}.jar org.openscoring.client.Deployer --model http://localhost:8080/openscoring/model/DecisionTreeIris --file DecisionTreeIris.pmml

java -cp openscoring-client-executable-${version}.jar org.openscoring.client.Evaluator --model http://localhost:8080/openscoring/model/DecisionTreeIris -XSepal_Length=5.1 -XSepal_Width=3.5 -XPetal_Length=1.4 -XPetal_Width=0.2

java -cp openscoring-client-executable-${version}.jar org.openscoring.client.CsvEvaluator --model http://localhost:8080/openscoring/model/DecisionTreeIris --input input.csv --output output.csv

java -cp openscoring-client-executable-${version}.jar org.openscoring.client.Undeployer --model http://localhost:8080/openscoring/model/DecisionTreeIris

The deployment and undeployment of models can be automated by launching the org.openscoring.client.DirectoryDeployer Java application class from the uber-JAR file, which listens for PMML file addition and removal events on the specified directory ("PMML directory watchdog"):

java -cp openscoring-client-executable-${version}.jar org.openscoring.client.DirectoryDeployer --model-collection http://localhost:8080/openscoring/model --dir pmml
Python

See the Openscoring-Python project.

R

See the Openscoring-R project.

REST API

Overview

Model REST API endpoints:

HTTP method  Endpoint            Required role(s)  Description
GET          /model              -                 Get the summaries of all models
PUT          /model/${id}        admin             Deploy a model
GET          /model/${id}        -                 Get the summary of a model
GET          /model/${id}/pmml   admin             Download a model as a PMML document
POST         /model/${id}        -                 Evaluate data in "single prediction" mode
POST         /model/${id}/batch  -                 Evaluate data in "batch prediction" mode
POST         /model/${id}/csv    -                 Evaluate data in "CSV prediction" mode
DELETE       /model/${id}        admin             Undeploy a model

By default, the "admin" role is granted to all HTTP requests that originate from the local network address.
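All endpoints live under a common base URL, so client-side URL construction is trivial. A minimal sketch (the model_url helper is hypothetical, not part of any Openscoring client library):

```python
# Hypothetical helper that builds model endpoint URLs per the table above.
BASE_URL = "http://localhost:8080/openscoring"

def model_url(model_id, suffix=""):
    """Return the URL of a model endpoint, optionally with a sub-path
    such as "pmml", "batch" or "csv"."""
    url = "{}/model/{}".format(BASE_URL, model_id)
    return url + "/" + suffix if suffix else url

print(model_url("DecisionTreeIris"))         # .../model/DecisionTreeIris
print(model_url("DecisionTreeIris", "csv"))  # .../model/DecisionTreeIris/csv
```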

In case of an error (i.e. response status codes 4XX or 5XX), the response body is a JSON serialized form of an org.openscoring.common.SimpleResponse (source) object.

Java clients may use the following idiom to check if an operation succeeded or failed:

ModelResponse response = ...;

// The error condition is encoded by initializing the "message" field and leaving all other fields uninitialized
String message = response.getMessage();
if(message != null){
	throw new RuntimeException(message);
}

// Proceed as usual
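Non-Java clients can apply the same convention. A rough Python counterpart (the "message" field is the convention described above; the helper itself is hypothetical):

```python
def ensure_success(response):
    """Raise if a JSON-decoded Openscoring response encodes an error.

    Per the convention above, an error condition is signalled by a
    populated "message" field; all other fields are left uninitialized.
    """
    message = response.get("message")
    if message is not None:
        raise RuntimeError(message)
    return response  # proceed as usual

ensure_success({"id": "DecisionTreeIris"})  # passes through unchanged
```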

Model deployment

PUT /model/${id}

Creates or updates a model.

The request body is a PMML document (indicated by content-type header text/xml or application/xml).

The response body is a JSON serialized form of an org.openscoring.common.ModelResponse (source) object.

Response status codes:

  • 200 OK. The model was updated.
  • 201 Created. A new model was created.
  • 400 Bad Request. The deployment failed permanently. The request body is not a valid and/or supported PMML document.
  • 403 Forbidden. The acting user does not have an "admin" role.
  • 500 Internal Server Error. The deployment failed temporarily.

Sample cURL invocation:

curl -X PUT --data-binary @DecisionTreeIris.pmml -H "Content-type: text/xml" http://localhost:8080/openscoring/model/DecisionTreeIris

The same, using the gzip encoding:

curl -X PUT --data-binary @DecisionTreeIris.pmml.gz -H "Content-encoding: gzip" -H "Content-type: text/xml" http://localhost:8080/openscoring/model/DecisionTreeIris
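On the client side, the gzip variant only changes how the request body and headers are prepared; with Python's standard library that looks like the following sketch (the PMML bytes here are a stand-in for a real document):

```python
import gzip

# Stand-in for the real PMML document bytes (e.g. the contents of DecisionTreeIris.pmml)
pmml_bytes = b'<PMML version="4.4"></PMML>'

body = gzip.compress(pmml_bytes)   # compressed request body
headers = {
    "Content-encoding": "gzip",    # tells the server to inflate the body
    "Content-type": "text/xml",
}

# The round-trip recovers the original document
assert gzip.decompress(body) == pmml_bytes
```

The body and headers can then be sent with any HTTP client via PUT /model/${id}.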

Model querying

GET /model

Gets the summaries of all models.

The response body is a JSON serialized form of an org.openscoring.common.BatchModelResponse (source) object.

Response status codes:

  • 200 OK. The model collection was queried.

Sample cURL invocation:

curl -X GET http://localhost:8080/openscoring/model
GET /model/${id}

Gets the summary of a model.

The response body is a JSON serialized form of an org.openscoring.common.ModelResponse (source) object.

Response status codes:

  • 200 OK. The model was queried.
  • 404 Not Found. The requested model was not found.

Sample cURL invocation:

curl -X GET http://localhost:8080/openscoring/model/DecisionTreeIris

Sample response:

{
	"id" : "DecisionTreeIris",
	"miningFunction" : "classification",
	"summary" : "Tree model",
	"properties" : {
		"created.timestamp" : "2015-03-17T12:41:35.933+0000",
		"accessed.timestamp" : "2015-03-21T09:35:58.582+0000",
		"file.size" : 4306,
		"file.checksum" : "e92855ed6575b75b10cc376f6a7df151d24b1793f1a034f53d9128c0aac9bb07"
	},
	"schema" : {
		"inputFields" : [
			{
				"id" : "Sepal_Length",
				"name" : "Sepal length in cm",
				"dataType" : "double",
				"opType" : "continuous",
				"values" : [ "[4.3, 7.9]" ]
			},
			{
				"id" : "Sepal_Width",
				"name" : "Sepal width in cm",
				"dataType" : "double",
				"opType" : "continuous",
				"values" : [ "[2.0, 4.4]" ]
			},
			{
				"id" : "Petal_Length",
				"name" : "Petal length in cm",
				"dataType" : "double",
				"opType" : "continuous",
				"values" : [ "[1.0, 6.9]" ]
			},
			{
				"id" : "Petal_Width",
				"name" : "Petal width in cm",
				"dataType" : "double",
				"opType" : "continuous",
				"values" : [ "[0.1, 2.5]" ]
			}
		],
		"targetFields" : [
			{
				"id" : "Species",
				"dataType" : "string",
				"opType" : "categorical",
				"values" : [ "setosa", "versicolor", "virginica" ]
			}
		],
		"outputFields" : [
			{
				"id" : "Probability_setosa",
				"dataType" : "double",
				"opType" : "continuous"
			},
			{
				"id" : "Probability_versicolor",
				"dataType" : "double",
				"opType" : "continuous"
			},
			{
				"id" : "Probability_virginica",
				"dataType" : "double",
				"opType" : "continuous"
			},
			{
				"id" : "Node_Id",
				"dataType" : "string",
				"opType" : "categorical"
			}
		]
	}
}

Field definitions are retrieved from the MiningSchema and Output elements of the PMML document. The input and group-by fields relate to the arguments attribute of the evaluation request, whereas the target and output fields relate to the result attribute of the evaluation response (see below).
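Since the model summary exposes value ranges for continuous input fields, a client can sanity-check arguments before requesting an evaluation. A hypothetical sketch against the inputFields structure shown above:

```python
def check_continuous(field, value):
    """Check a value against a continuous input field's "[lo, hi]" range string."""
    interval = field["values"][0]  # e.g. "[4.3, 7.9]"
    lo, hi = (float(v) for v in interval.strip("[]").split(","))
    return lo <= value <= hi

# Field definition copied from the sample response above
sepal_length = {"id": "Sepal_Length", "dataType": "double",
                "opType": "continuous", "values": ["[4.3, 7.9]"]}

print(check_continuous(sepal_length, 5.1))   # True
print(check_continuous(sepal_length, 12.0))  # False
```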

GET /model/${id}/pmml

Downloads a model.

The response body is a PMML document.

Response status codes:

  • 200 OK. The model was downloaded.
  • 403 Forbidden. The acting user does not have an "admin" role.
  • 404 Not Found. The requested model was not found.

Sample cURL invocation:

curl -X GET http://localhost:8080/openscoring/model/DecisionTreeIris/pmml

Model evaluation

POST /model/${id}

Evaluates data in "single prediction" mode.

The request body is a JSON serialized form of an org.openscoring.common.EvaluationRequest (source) object.

The response body is a JSON serialized form of an org.openscoring.common.EvaluationResponse (source) object.

Response status codes:

  • 200 OK. The evaluation was successful.
  • 400 Bad Request. The evaluation failed permanently due to missing or invalid input data.
  • 404 Not Found. The requested model was not found.
  • 500 Internal Server Error. The evaluation failed temporarily.

Sample cURL invocation:

curl -X POST --data-binary @EvaluationRequest.json -H "Content-type: application/json" http://localhost:8080/openscoring/model/DecisionTreeIris

Sample request:

{
	"id" : "record-001",
	"arguments" : {
		"Sepal_Length" : 5.1,
		"Sepal_Width" : 3.5,
		"Petal_Length" : 1.4,
		"Petal_Width" : 0.2
	}
}

Sample response:

{
	"id" : "record-001",
	"results" : {
		"Species" : "setosa",
		"Probability_setosa" : 1.0,
		"Probability_versicolor" : 0.0,
		"Probability_virginica" : 0.0,
		"Node_Id" : "2"
	}
}
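The request body is plain JSON, so it can be assembled with any JSON library. A sketch mirroring the sample request above (the helper name is hypothetical):

```python
import json

def evaluation_request(record_id, arguments):
    """Build a "single prediction" request body, matching the sample above."""
    return json.dumps({"id": record_id, "arguments": arguments})

body = evaluation_request("record-001", {
    "Sepal_Length": 5.1, "Sepal_Width": 3.5,
    "Petal_Length": 1.4, "Petal_Width": 0.2,
})
# POST the body to /model/${id} with content-type application/json
```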
POST /model/${id}/batch

Evaluates data in "batch prediction" mode.

The request body is a JSON serialized form of an org.openscoring.common.BatchEvaluationRequest (source) object.

The response body is a JSON serialized form of an org.openscoring.common.BatchEvaluationResponse (source) object.

Response status codes:

  • 200 OK. The evaluation was successful.
  • 400 Bad Request. The evaluation failed permanently due to missing or invalid input data.
  • 404 Not Found. The requested model was not found.
  • 500 Internal Server Error. The evaluation failed temporarily.

Sample cURL invocation:

curl -X POST --data-binary @BatchEvaluationRequest.json -H "Content-type: application/json" http://localhost:8080/openscoring/model/DecisionTreeIris/batch

The evaluation is performed at "record" isolation level. If the evaluation of some org.openscoring.common.EvaluationRequest object fails, then the corresponding org.openscoring.common.EvaluationResponse object encodes the error condition (see above).
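Because of the "record" isolation level, clients should inspect each element of the batch response individually. A hypothetical sketch, assuming the decoded JSON holds the individual EvaluationResponse objects in a "responses" array and reuses the "message" error convention described earlier:

```python
def split_batch(batch_response):
    """Separate successful and failed records in a decoded batch response."""
    ok, failed = [], []
    for r in batch_response.get("responses", []):
        # A failed record carries a populated "message" field
        (failed if r.get("message") is not None else ok).append(r)
    return ok, failed

batch = {"responses": [
    {"id": "record-001", "results": {"Species": "setosa"}},
    {"id": "record-002", "message": "Invalid input data"},
]}
ok, failed = split_batch(batch)
print(len(ok), len(failed))  # 1 1
```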

POST /model/${id}/csv

Evaluates data in "CSV prediction" mode.

The request body is a CSV document (indicated by content-type header text/plain). The data table must contain a data column for every input and group-by field. The ordering of data columns is not significant, because they are mapped to fields by name.

The CSV reader component detects the CSV dialect by probing comma (","), semicolon (";") and tab ("\t") as candidate delimiter characters. This detection functionality can be suppressed by supplying the CSV delimiter character explicitly using the delimiterChar query parameter.
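The probing behaviour can be mimicked client-side with Python's csv.Sniffer, restricted to the same three candidate characters (a sketch of the idea, not the server's actual implementation):

```python
import csv

def detect_delimiter(sample):
    """Guess the CSV delimiter by probing ",", ";" and tab."""
    return csv.Sniffer().sniff(sample, delimiters=",;\t").delimiter

print(detect_delimiter("Id;Sepal_Length;Sepal_Width\nrecord-001;5.1;3.5\n"))  # ;
```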

The response body is a CSV document. The data table contains a data column for every target and output field.

The first data column can be used for row identification purposes. It is copied over from the request data table to the response data table if its name equals "Id" (the comparison is case insensitive) and the number of rows did not change during the evaluation.

Response status codes:

  • 200 OK. The evaluation was successful.
  • 400 Bad Request. The evaluation failed permanently. The request body is not a valid and/or supported CSV document, or it contains cells with missing or invalid input data.
  • 404 Not Found. The requested model was not found.
  • 500 Internal Server Error. The evaluation failed temporarily.

Sample cURL invocation:

curl -X POST --data-binary @input.csv -H "Content-type: text/plain; charset=UTF-8" http://localhost:8080/openscoring/model/DecisionTreeIris/csv > output.csv

The same, using the gzip encoding:

curl -X POST --data-binary @input.csv.gz -H "Content-encoding: gzip" -H "Content-type: text/plain; charset=UTF-8" -H "Accept-encoding: gzip" http://localhost:8080/openscoring/model/DecisionTreeIris/csv > output.csv.gz

Sample request:

Id,Sepal_Length,Sepal_Width,Petal_Length,Petal_Width
record-001,5.1,3.5,1.4,0.2
record-002,7,3.2,4.7,1.4
record-003,6.3,3.3,6,2.5

Sample response:

Id,Species,Probability_setosa,Probability_versicolor,Probability_virginica,Node_Id
record-001,setosa,1.0,0.0,0.0,2
record-002,versicolor,0.0,0.9074074074074074,0.09259259259259259,6
record-003,virginica,0.0,0.021739130434782608,0.9782608695652174,7

The evaluation is performed at "all-records-or-nothing" isolation level. If the evaluation of some row fails, then the whole CSV document fails.
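Preparing an input document that takes advantage of the Id-copying behaviour is straightforward with Python's csv module (a sketch; the column layout follows the sample request above):

```python
import csv
import io

FIELDS = ["Id", "Sepal_Length", "Sepal_Width", "Petal_Length", "Petal_Width"]

def to_csv(records):
    """Render records as a CSV document whose first column is "Id"."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=FIELDS)
    writer.writeheader()
    writer.writerows(records)
    return buf.getvalue()

doc = to_csv([{"Id": "record-001", "Sepal_Length": 5.1, "Sepal_Width": 3.5,
               "Petal_Length": 1.4, "Petal_Width": 0.2}])
print(doc.splitlines()[0])  # Id,Sepal_Length,Sepal_Width,Petal_Length,Petal_Width
```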

Model undeployment

DELETE /model/${id}

Deletes a model.

The response body is a JSON serialized form of an org.openscoring.common.SimpleResponse (source) object.

Response status codes:

  • 200 OK. The model was deleted.
  • 403 Forbidden. The acting user does not have an "admin" role.
  • 404 Not Found. The requested model was not found.
  • 500 Internal Server Error. The undeployment failed temporarily.

Sample cURL invocation:

curl -X DELETE http://localhost:8080/openscoring/model/DecisionTreeIris

An HTTP PUT or DELETE method can be masked as an HTTP POST method by using the HTTP method override mechanism.

Sample cURL invocation that employs the X-HTTP-Method-Override request header:

curl -X POST -H "X-HTTP-Method-Override: DELETE" http://localhost:8080/openscoring/model/DecisionTreeIris

Sample cURL invocation that employs the _method query parameter:

curl -X POST http://localhost:8080/openscoring/model/DecisionTreeIris?_method=DELETE
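With Python's standard library, the header-based override can be attached to a plain POST request like so (the request is only constructed here, not sent):

```python
from urllib import request

req = request.Request(
    "http://localhost:8080/openscoring/model/DecisionTreeIris",
    method="POST",  # the real intent, DELETE, travels in the override header
    headers={"X-HTTP-Method-Override": "DELETE"},
)

print(req.get_method())                          # POST
print(req.get_header("X-http-method-override"))  # DELETE
```

Note that urllib stores header names capitalized, hence the lookup spelling in the last line.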

Support

Limited public support is available via the JPMML mailing list.

License

Openscoring is licensed under the terms and conditions of the GNU Affero General Public License, Version 3.0. For a quick summary of your rights ("Can") and obligations ("Cannot" and "Must") under AGPLv3, please refer to TLDRLegal.

If you would like to use Openscoring in a proprietary software project, then it is possible to enter into a licensing agreement which makes it available under the terms and conditions of the BSD 3-Clause License instead.

Additional information

Openscoring is developed and maintained by Openscoring Ltd, Estonia.

Interested in using Java PMML API or Openscoring REST API software in your company? Please contact [email protected]

openscoring-python's People

Contributors

vruusmann

openscoring-python's Issues

No JSON Object could be decoded

I'm running this project in Python 2.7 and it gives me this error while deploying the PMML model.

os.deploy("Iris", "C:\Users\moham\Documents\GitHub\openscoring-python\DecisionTreeIris.pmml", **kwargs)
Traceback (most recent call last):
File "", line 1, in
File "C:\Users\moham\AppData\Roaming\Python\Python27\site-packages\openscoring\__init__.py", line 84, in deploy
modelResponse = ModelResponse(**json.loads(response.text))
File "C:\Python27\lib\json\__init__.py", line 339, in loads
return _default_decoder.decode(s)
File "C:\Python27\lib\json\decoder.py", line 364, in decode
obj, end = self.raw_decode(s, idx=_w(s, 0).end())
File "C:\Python27\lib\json\decoder.py", line 382, in raw_decode
raise ValueError("No JSON object could be decoded")
ValueError: No JSON object could be decoded

Connection refused

I'm receiving an error when I try to run the example code:

from openscoring import Openscoring
os = Openscoring("http://localhost:8080/openscoring")
kwargs = {"auth" : ("admin", "adminadmin")}
os.deployFile("Iris", "model.pmml", **kwargs)

But I get the following error:

Traceback (most recent call last):
File "/home/user/builds/weather/venv/lib/python3.7/site-packages/urllib3/connection.py", line 159, in _new_conn
(self._dns_host, self.port), self.timeout, **extra_kw)
File "/home/user/builds/weather/venv/lib/python3.7/site-packages/urllib3/util/connection.py", line 80, in create_connection
raise err
File "/home/user/builds/weather/venv/lib/python3.7/site-packages/urllib3/util/connection.py", line 70, in create_connection
sock.connect(sa)
ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
File "/home/user/builds/weather/venv/lib/python3.7/site-packages/urllib3/connectionpool.py", line 600, in urlopen
chunked=chunked)
File "/home/user/builds/weather/venv/lib/python3.7/site-packages/urllib3/connectionpool.py", line 354, in _make_request
conn.request(method, url, **httplib_request_kw)
File "/usr/lib/python3.7/http/client.py", line 1229, in request
self._send_request(method, url, body, headers, encode_chunked)
File "/usr/lib/python3.7/http/client.py", line 1275, in _send_request
self.endheaders(body, encode_chunked=encode_chunked)
File "/usr/lib/python3.7/http/client.py", line 1224, in endheaders
self._send_output(message_body, encode_chunked=encode_chunked)
File "/usr/lib/python3.7/http/client.py", line 1016, in _send_output
self.send(msg)
File "/usr/lib/python3.7/http/client.py", line 956, in send
self.connect()
File "/home/user/builds/weather/venv/lib/python3.7/site-packages/urllib3/connection.py", line 181, in connect
conn = self._new_conn()
File "/home/user/builds/weather/venv/lib/python3.7/site-packages/urllib3/connection.py", line 168, in _new_conn
self, "Failed to establish a new connection: %s" % e)
urllib3.exceptions.NewConnectionError: <urllib3.connection.HTTPConnection object at 0x7f84c98f3668>: Failed to establish a new connection: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
File "/home/user/builds/weather/venv/lib/python3.7/site-packages/requests/adapters.py", line 449, in send
timeout=timeout
File "/home/user/builds/weather/venv/lib/python3.7/site-packages/urllib3/connectionpool.py", line 638, in urlopen
_stacktrace=sys.exc_info()[2])
File "/home/user/builds/weather/venv/lib/python3.7/site-packages/urllib3/util/retry.py", line 398, in increment
raise MaxRetryError(_pool, url, error or ResponseError(cause))
urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=8080): Max retries exceeded with url: /openscoring/model/Iris (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7f84c98f3668>: Failed to establish a new connection: [Errno 111] Connection refused'))

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
File "/home/user/builds/weather/test.py", line 4, in
os.deployFile("Iris", "model.pmml", **kwargs)
File "/home/user/builds/weather/venv/lib/python3.7/site-packages/openscoring/__init__.py", line 118, in deployFile
return self.deploy(id, instream, **kwargs)
File "/home/user/builds/weather/venv/lib/python3.7/site-packages/openscoring/__init__.py", line 112, in deploy
response = self._check_response(requests.put(self._model_url(id), **kwargs))
File "/home/user/builds/weather/venv/lib/python3.7/site-packages/requests/api.py", line 131, in put
return request('put', url, data=data, **kwargs)
File "/home/user/builds/weather/venv/lib/python3.7/site-packages/requests/api.py", line 60, in request
return session.request(method=method, url=url, **kwargs)
File "/home/user/builds/weather/venv/lib/python3.7/site-packages/requests/sessions.py", line 533, in request
resp = self.send(prep, **send_kwargs)
File "/home/user/builds/weather/venv/lib/python3.7/site-packages/requests/sessions.py", line 646, in send
r = adapter.send(request, **kwargs)
File "/home/user/builds/weather/venv/lib/python3.7/site-packages/requests/adapters.py", line 516, in send
raise ConnectionError(e, request=request)
requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=8080): Max retries exceeded with url: /openscoring/model/Iris (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7f84c98f3668>: Failed to establish a new connection: [Errno 111] Connection refused'))

Process finished with exit code 1

The `Openscoring.deploy` method throws exception "No Json Object could be decoded"

Hey, I was trying to use your openscoring-python library to score a simple PMML model, in fact the exact same model that you have developed in the sklearn2pmml library (DecisionTreeIris.pmml). The final aim is to develop a PMML model in R and score it in Python using your openscoring library. When I run the code, it gives an error in the os.deploy line: No JSON object could be decoded.
Could you please help with what is going on here? Thanks!! (Below is the attached code; I am using a Jupyter notebook.)
import pandas

path='/projects/wakari/MT_SNAP_Eligibility/SNAP_Generic_Modules/Module_6_Generic_PMML_Creation/'
iris_df = pandas.read_csv(path + 'iris.csv',header=0)

from sklearn2pmml import PMMLPipeline
from sklearn.tree import DecisionTreeClassifier

iris_pipeline = PMMLPipeline([
("classifier", DecisionTreeClassifier())
])
iris_pipeline.fit(iris_df[iris_df.columns.difference(["Species"])], iris_df["Species"])

from sklearn2pmml import sklearn2pmml

sklearn2pmml(iris_pipeline, "DecisionTreeIris.pmml", with_repr = True)

import openscoring

os = openscoring.Openscoring("http://localhost:8080/openscoring")

kwargs = {"auth" : ("admin", "adminadmin")}

# RUNS FINE TILL HERE

os.deploy("Iris", "DecisionTreeIris.pmml", **kwargs)

arguments = {
"Sepal_Length" : 5.1,
"Sepal_Width" : 3.5,
"Petal_Length" : 1.4,
"Petal_Width" : 0.2
}

result = os.evaluate("Iris", arguments)
print(result)

Batch csv evaluation returns NULL id's

I sent the /csv endpoint a features.csv file with "id" in the header. According to the documentation, this should result in the ids being reprinted in the response (and in the output CSV). However, I'm still seeing id=null in the response and no id column in the resulting CSV.

My input file looks something like

"id","date","feat1","feat2",...,"featn"
"23029854","2017-01-01","2.0","1.0",..."5.0"

My code is something like:

os = openscoring.Openscoring("http://{host}:8080/openscoring".format(host=HOSTS[args.host]))
os.deploy(model_name, model_pmml, **openscoring_login)
os.evaluateCsv(model_name, input_csv, output_csv)

Then the output file:

true_label,probability_0,probability_1
0,0.9991488452182992,8.511547817011017E-4
0,0.9779992879224841,0.022000712077515847
0,0.9779992879224841,0.022000712077515847

And the EvaluationResponse:

INFO: Returned EvaluationResponse{id=null, result={true_label=0, probability_0=0.9349664396728562, probability_1=0.0650335603271438}}

I tried modifying the "id" column even though the docs say reading "id" is case-insensitive, and removing the double quotes. Is there something else I might be doing wrong?

Thanks

Openscoring for Regression Pmml model prediction

I am trying to run a linear regression model using my input and PMML files; unfortunately, it seems Openscoring only supports classification models.

def openscoring():
    os = Openscoring("http://localhost:8080/openscoring")
    # # A dictionary of user-specified parameters
    kwargs = {"auth": ("admin", "adminadmin")}
    pmml_file = "./resources/LinearRegression.pmml"
    input_file = "./resources/test.csv"
    output_file = "./resources/result.csv"

    os.deployFile("regression", pmml_file, **kwargs)
    os.evaluateCsvFile("regression", input_file, output_file)

    os.undeploy("regression", **kwargs)

if __name__ == "__main__":
    openscoring()

My input file just contains the class column.

And I get the following information on the SERVER terminal

org.openscoring.service.ModelResource evaluate
INFO: Returned EvaluationResponse{id=null, result={class=null}}

The web server at http://localhost:8080/openscoring did not identify itself as Openscoring/2.0 service

Hello, I am trying to run the iris example with PMML in Python.

As I understand it, to run Openscoring we need the Openscoring server running at http://localhost:8080/openscoring/

But when I run it, I encounter an error saying that the server doesn't provide the right thing.

When I use my browser to open http://localhost:8080/openscoring/ , I see there is no message.

Could you help me figure out what is wrong? I tried to run the server from
openscoring-server-executable-1.4.3.jar .

How to evaluate model with many records at once?

The example provided uses just one record:

arguments = {
	"Sepal_Length" : 5.1,
	"Sepal_Width" : 3.5,
	"Petal_Length" : 1.4,
	"Petal_Width" : 0.2
}

result = os.evaluate("Iris", arguments)
print(result)

I was wondering if there was a way to do this but with many records, and to call the Openscorer just once. For example, if I have something like:

arguments = [
{
	"Sepal_Length" : 5.1,
	"Sepal_Width" : 3.5,
	"Petal_Length" : 1.4,
	"Petal_Width" : 0.2
},
{
	"Sepal_Length" : 5.0,
	"Sepal_Width" : 3.7,
	"Petal_Length" : 1.2,
	"Petal_Width" : 0.4
}
]

Is there a way to evaluate the model with both records at once?

ConnectionError: HTTPConnectionPool(host='localhost', port=8080)

Hello, I am trying to use Openscoring as a web service to get APIs. I am very new to this so I do hope to get some help.

from openscoring import Openscoring

os = Openscoring(base_url = "http://localhost:8080/openscoring")

This part runs on Jupyter Notebook without issue; however, attempting to open the base URL in a browser results in the site not being reachable. The main error occurs in the next line of code, which deploys the PMML file.

os.deployFile("Traffic", "lr_model.pmml")

The full error message is as follows:

ConnectionRefusedError                    Traceback (most recent call last)
D:\anaconda3\lib\site-packages\urllib3\connection.py in _new_conn(self)
    168         try:
--> 169             conn = connection.create_connection(
    170                 (self._dns_host, self.port), self.timeout, **extra_kw

D:\anaconda3\lib\site-packages\urllib3\util\connection.py in create_connection(address, timeout, source_address, socket_options)
     95     if err is not None:
---> 96         raise err
     97 

D:\anaconda3\lib\site-packages\urllib3\util\connection.py in create_connection(address, timeout, source_address, socket_options)
     85                 sock.bind(source_address)
---> 86             sock.connect(sa)
     87             return sock

ConnectionRefusedError: [WinError 10061] No connection could be made because the target machine actively refused it

During handling of the above exception, another exception occurred:

NewConnectionError                        Traceback (most recent call last)
D:\anaconda3\lib\site-packages\urllib3\connectionpool.py in urlopen(self, method, url, body, headers, retries, redirect, assert_same_host, timeout, pool_timeout, release_conn, chunked, body_pos, **response_kw)
    698             # Make the request on the httplib connection object.
--> 699             httplib_response = self._make_request(
    700                 conn,

D:\anaconda3\lib\site-packages\urllib3\connectionpool.py in _make_request(self, conn, method, url, timeout, chunked, **httplib_request_kw)
    393             else:
--> 394                 conn.request(method, url, **httplib_request_kw)
    395 

D:\anaconda3\lib\site-packages\urllib3\connection.py in request(self, method, url, body, headers)
    233             headers["User-Agent"] = _get_default_user_agent()
--> 234         super(HTTPConnection, self).request(method, url, body=body, headers=headers)
    235 

D:\anaconda3\lib\http\client.py in request(self, method, url, body, headers, encode_chunked)
   1254         """Send a complete request to the server."""
-> 1255         self._send_request(method, url, body, headers, encode_chunked)
   1256 

D:\anaconda3\lib\http\client.py in _send_request(self, method, url, body, headers, encode_chunked)
   1300             body = _encode(body, 'body')
-> 1301         self.endheaders(body, encode_chunked=encode_chunked)
   1302 

D:\anaconda3\lib\http\client.py in endheaders(self, message_body, encode_chunked)
   1249             raise CannotSendHeader()
-> 1250         self._send_output(message_body, encode_chunked=encode_chunked)
   1251 

D:\anaconda3\lib\http\client.py in _send_output(self, message_body, encode_chunked)
   1009         del self._buffer[:]
-> 1010         self.send(msg)
   1011 

D:\anaconda3\lib\http\client.py in send(self, data)
    949             if self.auto_open:
--> 950                 self.connect()
    951             else:

D:\anaconda3\lib\site-packages\urllib3\connection.py in connect(self)
    199     def connect(self):
--> 200         conn = self._new_conn()
    201         self._prepare_conn(conn)

D:\anaconda3\lib\site-packages\urllib3\connection.py in _new_conn(self)
    180         except SocketError as e:
--> 181             raise NewConnectionError(
    182                 self, "Failed to establish a new connection: %s" % e

NewConnectionError: <urllib3.connection.HTTPConnection object at 0x0000014AB7CBFEB0>: Failed to establish a new connection: [WinError 10061] No connection could be made because the target machine actively refused it

During handling of the above exception, another exception occurred:

MaxRetryError                             Traceback (most recent call last)
D:\anaconda3\lib\site-packages\requests\adapters.py in send(self, request, stream, timeout, verify, cert, proxies)
    438             if not chunked:
--> 439                 resp = conn.urlopen(
    440                     method=request.method,

D:\anaconda3\lib\site-packages\urllib3\connectionpool.py in urlopen(self, method, url, body, headers, retries, redirect, assert_same_host, timeout, pool_timeout, release_conn, chunked, body_pos, **response_kw)
    754 
--> 755             retries = retries.increment(
    756                 method, url, error=e, _pool=self, _stacktrace=sys.exc_info()[2]

D:\anaconda3\lib\site-packages\urllib3\util\retry.py in increment(self, method, url, response, error, _pool, _stacktrace)
    573         if new_retry.is_exhausted():
--> 574             raise MaxRetryError(_pool, url, error or ResponseError(cause))
    575 

MaxRetryError: HTTPConnectionPool(host='localhost', port=8080): Max retries exceeded with url: /openscoring/model/Traffic (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x0000014AB7CBFEB0>: Failed to establish a new connection: [WinError 10061] No connection could be made because the target machine actively refused it'))

During handling of the above exception, another exception occurred:

ConnectionError                           Traceback (most recent call last)
<ipython-input-3-38aa5e890439> in <module>
----> 1 os.deployFile("Traffic", "lr_model.pmml")

D:\anaconda3\lib\site-packages\openscoring\__init__.py in deployFile(self, id, file, **kwargs)
     77         def deployFile(self, id, file, **kwargs):
     78                 with open(file, "rb") as instream:
---> 79                         return self.deploy(id, instream, **kwargs)
     80 
     81         def evaluate(self, id, payload = {}, **kwargs):

D:\anaconda3\lib\site-packages\openscoring\__init__.py in deploy(self, id, pmml, **kwargs)
     71         def deploy(self, id, pmml, **kwargs):
     72                 kwargs = _merge_dicts(kwargs, data = pmml, json = None, auth = self.auth, headers = {"content-type" : "application/xml"})
---> 73                 response = self._check_response(requests.put(self._model_url(id), **kwargs))
     74                 modelResponse = ModelResponse(**json.loads(response.text))
     75                 return modelResponse.ensureSuccess()

D:\anaconda3\lib\site-packages\requests\api.py in put(url, data, **kwargs)
    132     """
    133 
--> 134     return request('put', url, data=data, **kwargs)
    135 
    136 

D:\anaconda3\lib\site-packages\requests\api.py in request(method, url, **kwargs)
     59     # cases, and look like a memory leak in others.
     60     with sessions.Session() as session:
---> 61         return session.request(method=method, url=url, **kwargs)
     62 
     63 

D:\anaconda3\lib\site-packages\requests\sessions.py in request(self, method, url, params, data, headers, cookies, files, auth, timeout, allow_redirects, proxies, hooks, stream, verify, cert, json)
    540         }
    541         send_kwargs.update(settings)
--> 542         resp = self.send(prep, **send_kwargs)
    543 
    544         return resp

D:\anaconda3\lib\site-packages\requests\sessions.py in send(self, request, **kwargs)
    653 
    654         # Send the request
--> 655         r = adapter.send(request, **kwargs)
    656 
    657         # Total elapsed time of the request (approximately)

D:\anaconda3\lib\site-packages\requests\adapters.py in send(self, request, stream, timeout, verify, cert, proxies)
    514                 raise SSLError(e, request=request)
    515 
--> 516             raise ConnectionError(e, request=request)
    517 
    518         except ClosedPoolError as e:

ConnectionError: HTTPConnectionPool(host='localhost', port=8080): Max retries exceeded with url: /openscoring/model/Traffic (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x0000014AB7CBFEB0>: Failed to establish a new connection: [WinError 10061] No connection could be made because the target machine actively refused it'))

I sincerely apologize if this is a dumb issue, but I do not know what is wrong or why. I have also run spark-shell --jars D:\spark-3.0.2-bin-hadoop2.7\bin\jpmml-sparkml-executable-${1.6.5}.jar with no issues. I am currently using Spark 3.0.2 and Python 3.8.8. Thank you for your time and consideration. Please let me know if more information is required from me.
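For context, WinError 10061 (connection refused) typically means that nothing is listening on localhost:8080 at all — for example, the Openscoring server has not been started yet (e.g. by running the server uber-JAR with `java -jar openscoring-server-executable-<version>.jar`). A minimal pre-flight check, using only the standard library (the host and port here are assumptions matching the traceback above):

```python
import socket

def server_reachable(host="localhost", port=8080, timeout=2.0):
    """Return True if something accepts TCP connections on host:port."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# If this prints False, start the Openscoring server before calling deployFile().
print(server_reachable("localhost", 8080))
```

If the check passes but the error persists, a proxy or firewall intercepting localhost traffic would be the next thing to rule out.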

How to get prediction probabilities?

Suppose we have a PMML classifier and evaluate it using this library. The output for each observation is then the class the model predicts it belongs to. However, I'd also like to know the probability of each observation belonging to each class.

Is there a way to achieve this?
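Generally speaking, Openscoring returns whatever output fields the PMML document declares. If the model's Output section declares per-class probability fields (e.g. OutputField elements with resultFeature="probability"), they should come back in the evaluation results alongside the predicted class. A sketch under that assumption — the field names below (`probability(setosa)` etc., as emitted by some converters) and the sample result dict are illustrative, not taken from a real response:

```python
def extract_probabilities(results):
    """Pick out per-class probability output fields from an Openscoring result dict."""
    return {k: v for k, v in results.items() if k.startswith("probability(")}

# Hypothetical shape of an evaluation response for an iris classifier;
# in real use, results would come from Openscoring(...).evaluate(model_id, arguments).
sample = {
    "Species": "setosa",
    "probability(setosa)": 1.0,
    "probability(versicolor)": 0.0,
    "probability(virginica)": 0.0,
}
print(extract_probabilities(sample))
```

If the PMML has no such output fields, they can usually be added at conversion time (most converters have an option to emit probability outputs) or by editing the Output section of the PMML document directly.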
