Get some python in google cloud functions
Maybe add a build parameter or something that you can use to select which container / Python version to bundle the executable with.
Looks like Ubuntu 17.10 has dropped Python 3.5 and isn't respecting the apt-get -y python3.5
installation, causing subsequent RUN commands to fail.
I've tried a few workarounds, but haven't been able to successfully run PyInstaller with various Docker configurations that have Python 3.5 enabled. Mostly running into GLIBC_2.25
not found errors.
I believe the 3.5 requirement is due to PyInstaller not having support for 3.6 yet?
Hello,
I have just started using cloud-functions-python for the first time. I tried reading the documentation in case I was configuring it wrong.
Whenever I run the build command, I get permission errors like:
IOError: [Errno 13] Permission denied: 'cloudfn/index.js'
but if I run it as root, it works fine; likewise if I recursively grant all permissions to the files in the cloudfn folder.
I would really love it if a non-root user could also build and deploy without permission problems.
It seems that PyInstaller now supports 3.6!
http://www.pyinstaller.org/:
PyInstaller is a program that freezes (packages) Python programs into stand-alone executables, under Windows, Linux, Mac OS X, FreeBSD, Solaris and AIX. Its main advantages over similar tools are that PyInstaller works with Python 2.7 and 3.3–3.6, it builds smaller executables thanks to transparent compression, it is fully multi-platform, and uses OS support to load the dynamic libraries, thus ensuring full compatibility.
I am new to this cloud thing and also building a python application.
This will be my first python application.
I have Python code that requires these libraries:
matplotlib
numpy
OpenCV
scipy
sys
OpenCV is installed using make on my local system; how will this library be called and used in a cloud function?
Is it possible to use cloud functions with these other dependent libraries? If so, how?
I came upon this repository while attempting to package a grpc and google cloud firestore project using pyinstaller, as upstream pyinstaller does not have the hooks necessary to package these. Could I add the hooks under cloudfn/hooks to upstream pyinstaller? I am new to github so I apologise if this issue is not the right place for this.
Can anyone help fix this Google auth error that is generated while using the emulator?
This is my index.js
file after the build
var googleAuth = require('google-auto-auth')();
//Handle Background events according to spec
function shimHandler(data) {
return new Promise((resolve, reject) => {
googleAuth.getToken(function (err, oauthToken) {
if (err) { console.log('googleAuth error')
reject()
} else {
const p = require('child_process').execFile('./dist/{{config["output_name"]}}/{{config["output_name"]}}', {
env: Object.assign(process.env, {
'GOOGLE_OAUTH_TOKEN': oauthToken,
})
});
var lastMessage;
p.stdin.setEncoding('utf-8');
//Log standard err messages to standard err
p.stderr.on('data', (err) => {
console.error(err.toString());
})
p.stdout.on('data', (out) => {
console.log(out.toString());
lastMessage = out;
})
p.on('close', (code) => {
if (code !== 0) {
//This means the shim failed / panicked. So we reject hard.
reject();
} else {
// Resolve the promise with the latest output from stdout
// In case of shimming http, this is the response object.
resolve(lastMessage);
}
});
//Write the object/message/request to the shim's stdin and signal
//End of input.
p.stdin.write(JSON.stringify(data));
p.stdin.end();
}
});
});
}
//Handle http request
function handleHttp(req, res) {
var requestBody;
console.log(req.get('content-type'))
switch (req.get('content-type')) {
case 'application/json':
requestBody = JSON.stringify(req.body);
break;
case 'application/x-www-form-urlencoded':
//The body parser for cloud functions does this, so just play along
//with it, sorry man! Maybe we should construct some kind of proper
//form request body? or not. let's keep it this way for now, as
//This is how cloud functions behaves.
//req.setHeader('content-type', 'application/json')
requestBody = JSON.stringify(req.body);
break;
case 'application/octet-stream':
requestBody = req.body;
break;
case 'text/plain':
requestBody = req.body;
break;
}
var fullUrl = req.protocol + '://' + req.get('host') + req.originalUrl;
console.log('Request body ');console.log(JSON.stringify(requestBody));
console.log('Request headers');console.log(JSON.stringify(req.headers));
console.log('Request method'); console.log(JSON.stringify(req.method));
console.log('Request remote_addr'); console.log(JSON.stringify(req.ip));
console.log('Req protocol'); console.log(JSON.stringify(req.protocol));
console.log('Req get host'); console.log(JSON.stringify(req.get('host')));
console.log('originalUrl'); console.log(JSON.stringify(req.originalUrl));
var httpRequest = {
'body': requestBody,
'headers': req.headers,
'method': req.method,
'remote_addr': req.ip,
'url': fullUrl
};
shimHandler(httpRequest)
.then((result) => {
console.log('should come here')
data = JSON.parse(result);
res.status(data.status_code);
res.set(data.headers)
res.send(data.body);
})
.catch((e) => {
console.log(e);
console.log('some error has occurred');
res.status(500).end();
})
}
//{% if config["trigger_http"] %}
exports['{{config["function_name"]}}'] = function(req, res) {
return handleHttp(req, res);
}//{% else %}
exports['{{config["function_name"]}}'] = function(event, callback) {
return shimHandler(event.data).then(function() {
callback();
}).catch(function() {
callback(new Error("Function failed"));
});
}//{% endif %}
This is what I get on running logs using
functions logs read
Request body
2018-03-24T06:15:58.394Z - info: "{"refID":"50","refTable":"SubInspectionImages","image":"Amir"}"
Request headers
{"connection":"close","content-length":"48","accept-encoding":"gzip, deflate","host":"localhost:8010","accept":"/","user-agent":"PostmanRuntime/7.1.1","postman-token":"7f3d37c7-44be-4347-9af1-75dceb347277","cache-control":"no-cache","content-type":"application/x-www-form-urlencoded"}
Request method
"POST"
Request remote_addr
2018-03-24T06:15:58.394Z - info: "127.0.0.1"
Req protocol
2018-03-24T06:15:58.395Z - info: "http"
Req get host
"localhost:8010"
originalUrl
"/"
2018-03-24T06:15:58.405Z - info: googleAuth error
2018-03-24T06:15:58.406Z - info: undefined
2018-03-24T06:15:58.406Z - info: some error has occurred
In Postman, the cloud function works as expected.
Only when the cloud function URL is called from a browser do I get this error in the logs:
Traceback (most recent call last):
File "function.py", line 29, in
File "site-packages/cloudfn/http.py", line 45, in handle_http_event
File "site-packages/cloudfn/http.py", line 12, in init
KeyError: 'body'
This is my function.py file content
from FTUL import mainP
from cloudfn.google_account import get_credentials
from cloudfn.http import handle_http_event, Response
from google.cloud import bigquery
import json
def handle_http(req):
biquery_client = bigquery.Client(credentials=get_credentials())
print(req)
print(req.method)
print(req.body)
if req.method == 'OPTIONS':
return Response(
status_code=200,
)
if req.method == 'POST':
rBody = json.loads(req.body)
im = rBody['image']
rID = rBody['refID']
rTable = rBody['refTable']
uAlgo = 0
empID = rBody['empID']
val = mainP(im,rID,rTable,uAlgo,empID)
return Response(
status_code=200,
)
handle_http_event(handle_http)
In the browser logs the error shows up as a 500 error on the 'OPTIONS' request.
The browser sends a req.method of OPTIONS, but there is no body present in this request, while the function expects one.
What's the workaround for this?
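One possible workaround (a sketch, not the library's official fix): answer the browser's OPTIONS preflight before ever reading the body, so the missing-body KeyError never triggers. The Response class and the dict-shaped request below are stand-ins, not cloudfn's actual Request/Response types:

```python
import json

class Response:
    # Minimal stand-in for cloudfn.http.Response so the sketch runs on its own.
    def __init__(self, status_code=200, body=None, headers=None):
        self.status_code = status_code
        self.body = body
        self.headers = headers or {}

def handle_http(req):
    # Browsers send an OPTIONS preflight with no body; return early so we
    # never attempt to parse a body that isn't there.
    if req.get("method") == "OPTIONS":
        return Response(status_code=204, headers={"Access-Control-Allow-Origin": "*"})
    if req.get("method") == "POST":
        payload = json.loads(req.get("body") or "{}")
        return Response(status_code=200, body=payload)
    return Response(status_code=405)

preflight = handle_http({"method": "OPTIONS"})          # no body, no crash
post = handle_http({"method": "POST", "body": '{"refID": "50"}'})
```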
I'm new to git. I know we can push changes to the master branch, but I don't know how to do that, so I'm just noting the file name and line number.
Added
'--hidden-import', 'scipy._lib.messagestream',
to cli.py at line 95 to support scipy.
Removed
req.set('content-type', 'application/json')
from index.js at line 55 to make http.py work properly for the HTML trigger.
Hi,
I am trying to run my code on a cloud function that has a dependency on pandas. After deploying the function and triggering it, I'm facing the following error:
ImportError: C extension: No module named timedeltas not built. If you want to import pandas from the source directory, you may need to run 'python setup.py build_ext --inplace --force' to build the C extensions first. Failed to execute script main
I am using
Python version 2.7
Pandas Version 0.22.0
and running below cmd:
sudo py-cloud-fn test-flask-bigquery1 http --python_version 2.7 -p -f main.py && cd cloudfn/target && gcloud beta functions deploy test-flask-bigquery1 --trigger-http --stage-bucket <bucketname> --memory 2048MB && cd ../..
Please provide the solution for this.
I have to display a few messages on Slack, which requires tokens.
Here's the link: Tokens & Authentication
How can I do the same inside a cloud function in Python? Any ideas?
Running using cloud-function-python with Flask.
I can get the data, but I'm getting a 500 error like this.
How do I fix this issue?
[13] Failed to execute script function Traceback (most recent call last):
File "function.py", line 34, in File "site-packages/cloudfn/flask_handler.py", line 50, in handle_http_event File "json/init.py", line 230, in dumps File "json/encoder.py", line 198, in encode File "json/encoder.py", line 256, in iterencode File "json/encoder.py", line 179, in default TypeError: b'{\n "json": {\n "empID": "I123", \n"refID": "69", \n "refTable": "123456"\n }, \n "message": "Hello world!"\n}\n' is not JSON serializable
Anyone help me out here?
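The traceback shows json.dumps being handed a bytes object (the b'...' response body), which Python 3's json module refuses to serialize. A minimal illustration of the failure mode and the usual fix, decoding (or parsing) the bytes first:

```python
import json

raw = b'{"json": {"empID": "I123", "refID": "69"}, "message": "Hello world!"}'

# json.dumps(raw) would raise:
#   TypeError: ... is not JSON serializable
# Decode to str first, or json.loads it if it is already JSON text.
decoded = raw.decode("utf-8")
parsed = json.loads(decoded)
serialized = json.dumps(parsed)   # now succeeds
```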
I have Python code that needs parameters from the client calling the cloud function.
I'm new to this, please help.
How can the client send the data that is needed by the parameters, and how can I extract this parameter data sent by the client in cloud-function-python?
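To the question above: with an HTTP-triggered function, a client typically sends parameters either in the query string or as a JSON body, and the function parses them out. A hedged sketch using only the standard library (the URL and field names are made up for illustration, and the request shape is assumed, not cloudfn's actual Request class):

```python
import json
from urllib.parse import urlparse, parse_qs

# Case 1: parameters in the query string,
# e.g. https://example.invalid/fn?refID=50&refTable=Images
url = "https://example.invalid/fn?refID=50&refTable=Images"
query = parse_qs(urlparse(url).query)   # each value is a list
ref_id = query["refID"][0]

# Case 2: parameters in a JSON POST body
body = '{"refID": "50", "refTable": "Images"}'
params = json.loads(body)
```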
I deployed my function.py on Google Cloud.
I am getting a NameError: name 'bigquery' is not defined error in the error logs when I hit the Google Cloud Function URL.
I have added below code
from cloudfn.google_account import get_credentials
biquery_client = bigquery.Client(credentials=get_credentials())
I have no idea what's happening
My bigquery API is already enabled according to this post
Can anyone help?
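A NameError like this usually means the name was simply never imported: the snippet above calls bigquery.Client but never does `from google.cloud import bigquery`. A minimal demonstration of the failure mode (using the same undefined name; google.cloud itself isn't needed to show it):

```python
# Referencing a name that was never imported raises NameError at call time,
# which is exactly what the cloud function's error logs show for 'bigquery'.
try:
    client = bigquery.Client()  # 'bigquery' is not defined in this module
except NameError as e:
    message = str(e)
```

Adding the missing import at the top of function.py should resolve it, assuming the google-cloud-bigquery package is bundled.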
Can't seem to build and install.
Processing dependencies for pycloudfn==0.1.209
Searching for google-auth==1.3.0
Reading https://pypi.python.org/simple/google-auth/
Download error on https://pypi.python.org/simple/google-auth/: [SSL: TLSV1_ALERT_PROTOCOL_VERSION] tlsv1 alert protocol version (_ssl.c:719) -- Some packages may not be found!
Couldn't retrieve index page for 'google-auth'
Scanning index of all packages (this may take a while)
Reading https://pypi.python.org/simple/
Download error on https://pypi.python.org/simple/: [SSL: TLSV1_ALERT_PROTOCOL_VERSION] tlsv1 alert protocol version (_ssl.c:719) -- Some packages may not be found!
No local packages or working download links found for google-auth==1.3.0
error: Could not find suitable distribution for Requirement.parse('google-auth==1.3.0')
It seems pip is having an issue ("Could not fetch URL...") confirming the SSL certificate.
I got it fixed locally with the comment by @lucalenardi in the above link.
The issue occurs when trying to build and install cloud-function-python.
I followed all the steps to set up the requirements for cloud-function-python.
I have a Python file that I would like to execute, and all the virtual environment libraries it requires are in a requirements.txt file.
Can anyone guide me on what I should do to execute the Python file?
Has anyone else had trouble publishing to a pubsub topic from within a cloud function?
I have deployed a cloud function something along the lines of:
from google.cloud import pubsub
def pubsub_handler(message):
client = pubsub.PublisherClient()
topic = 'projects/TEST-PROJECT/topics/TEST-TOPIC'
client.publish(topic=topic, data='', test_param='test')
handle_pubsub_event(pubsub_handler)
and when I invoke the function and look at the logs, I see the following error:
Exception in 'grpc._cython.cygrpc.ssl_roots_override_callback' ignored E0424 15:48:44.194749483 17 security_connector.cc:1170] assertion failed: pem_root_certs != nullptr
I am using Python 2.7
I built and deployed it locally:
py-cloud-fn handle_http http -f function.py --python_version 3.5
py-cloud-fn handle_http http --python_version 3.5 -p -f function.py && cd cloudfn/target && npm install && functions deploy handle_http --trigger-http && cd ../..
When I hit the URL with Postman,
I get this:
TypeError [ERR_INVALID_ARG_TYPE]: The "path" argument must be of type string
at assertPath (path.js:39:11)
at Object.join (path.js:1218:7)
at Object.readLogLines (/Users/santhoshdc/.nvm/versions/node/v9.8.0/lib/node_modules/@google-cloud/functions-emulator/src/emulator/logs.js:33:39)
at Controller.getLogs (/Users/santhoshdc/.nvm/versions/node/v9.8.0/lib/node_modules/@google-cloud/functions-emulator/src/cli/controller.js:462:10)
at Object.exports.handler (/Users/santhoshdc/.nvm/versions/node/v9.8.0/lib/node_modules/@google-cloud/functions-emulator/src/cli/commands/logs/read.js:58:14)
at Object.runCommand (/Users/santhoshdc/.nvm/versions/node/v9.8.0/lib/node_modules/@google-cloud/functions-emulator/node_modules/yargs/lib/command.js:235:44)
at Object.parseArgs [as _parseArgs] (/Users/santhoshdc/.nvm/versions/node/v9.8.0/lib/node_modules/@google-cloud/functions-emulator/node_modules/yargs/yargs.js:1014:30)
at Object.runCommand (/Users/santhoshdc/.nvm/versions/node/v9.8.0/lib/node_modules/@google-cloud/functions-emulator/node_modules/yargs/lib/command.js:195:96)
at Object.parseArgs [as _parseArgs] (/Users/santhoshdc/.nvm/versions/node/v9.8.0/lib/node_modules/@google-cloud/functions-emulator/node_modules/yargs/yargs.js:1014:30)
at Object.parse (/Users/santhoshdc/.nvm/versions/node/v9.8.0/lib/node_modules/@google-cloud/functions-emulator/node_modules/yargs/yargs.js:542:25)
I have no idea
/Users/santhoshdc/.nvm/versions/node/v9.8.0/lib/node_modules/@google-cloud/functions-emulator/src/emulator/logs.js
function readLogLines (filePath, linesToRead, output) {
try {
const parts = path.parse(filePath);
const files = fs
.readdirSync(parts.dir)
.filter((file) => file && file.includes(parts.name));
files.sort();
// Here, we naively select the newest log file, even if the user wants to
// display more lines than are available in the newest log file.
const rl = readline.createInterface({
input: fs.createReadStream(path.join(parts.dir, files[files.length - 1])),
terminal: false
});
const lines = [];
rl
.on('line', (line) => {
lines.push(line);
})
.on('close', () => {
lines
.slice(lines.length - linesToRead)
.forEach((line) => output(`${line}\n`));
});
} catch (err) {
if (err.code === 'ENOENT') {
output('');
return;
}
throw err;
}
}
/Users/santhoshdc/.nvm/versions/node/v9.8.0/lib/node_modules/@google-cloud/functions-emulator/src/cli/controller.js
/**
* Writes lines from the Emulator log file in FIFO order.
* Lines are taken from the end of the file according to the limit argument.
* That is, when limit is 10 will return the last (most recent) 10 lines from
* the log (or fewer if there are fewer than 10 lines in the log), in the order
* they were written to the log.
*
* @param {integer} limit The maximum number of lines to write
*/
getLogs (limit) {
if (!limit) {
limit = 20;
}
logs.readLogLines(this.config.logFile, limit, (val) => {
this.write(val);
});
}
/Users/santhoshdc/.nvm/versions/node/v9.8.0/lib/node_modules/@google-cloud/functions-emulator/src/cli/commands/logs/read.js
/**
* http://yargs.js.org/docs/#methods-commandmodule-providing-a-command-module
*/
exports.command = 'read';
exports.description = DESCRIPTION;
exports.builder = (yargs) => {
yargs
.usage(USAGE)
.options({
limit: {
alias: 'l',
default: 20,
description: 'Number of log entries to be fetched.',
type: 'number',
requiresArg: true
}
});
EXAMPLES['logs.read'].forEach((e) => yargs.example(e[0], e[1]));
};
exports.handler = (opts) => {
const controller = new Controller(opts);
let limit = 20;
if (opts && opts.limit) {
limit = parseInt(opts.limit, 10);
}
controller.getLogs(limit);
};
/Users/santhoshdc/.nvm/versions/node/v9.8.0/lib/node_modules/@google-cloud/functions-emulator/node_modules/yargs/lib/command.js
// we apply validation post-hoc, so that custom
// checks get passed populated positional arguments.
if (!yargs._hasOutput()) yargs._runValidation(innerArgv, aliases, positionalMap, yargs.parsed.error)
if (commandHandler.handler && !yargs._hasOutput()) {
yargs._setHasOutput()
if (commandHandler.middlewares.length > 0) {
const middlewareArgs = commandHandler.middlewares.reduce(function (initialObj, middleware) {
return Object.assign(initialObj, middleware(innerArgv))
}, {})
Object.assign(innerArgv, middlewareArgs)
}
const handlerResult = commandHandler.handler(innerArgv)
if (handlerResult && typeof handlerResult.then === 'function') {
handlerResult.then(
null,
(error) => yargs.getUsageInstance().fail(null, error)
)
}
/Users/santhoshdc/.nvm/versions/node/v9.8.0/lib/node_modules/@google-cloud/functions-emulator/node_modules/yargs/lib/command.js
// what does yargs look like after the builder is run?
let innerArgv = parsed.argv
let innerYargs = null
let positionalMap = {}
if (command) {
currentContext.commands.push(command)
currentContext.fullCommands.push(commandHandler.original)
}
if (typeof commandHandler.builder === 'function') {
// a function can be provided, which builds
// up a yargs chain and possibly returns it.
innerYargs = commandHandler.builder(yargs.reset(parsed.aliases))
// if the builder function did not yet parse argv with reset yargs
// and did not explicitly set a usage() string, then apply the
// original command string as usage() for consistent behavior with
// options object below.
if (yargs.parsed === false) {
if (shouldUpdateUsage(yargs)) {
yargs.getUsageInstance().usage(
usageFromParentCommandsCommandHandler(parentCommands, commandHandler),
commandHandler.description
)
}
innerArgv = innerYargs ? innerYargs._parseArgs(null, null, true, commandIndex) : yargs._parseArgs(null, null, true, commandIndex)
} else {
innerArgv = yargs.parsed.argv
}
/Users/santhoshdc/.nvm/versions/node/v9.8.0/lib/node_modules/@google-cloud/functions-emulator/node_modules/yargs/yargs.js
const handlerKeys = command.getCommands()
const skipDefaultCommand = argv[helpOpt] && (handlerKeys.length > 1 || handlerKeys[0] !== '$0')
if (argv._.length) {
if (handlerKeys.length) {
let firstUnknownCommand
for (let i = (commandIndex || 0), cmd; argv._[i] !== undefined; i++) {
cmd = String(argv._[i])
if (~handlerKeys.indexOf(cmd) && cmd !== completionCommand) {
setPlaceholderKeys(argv)
// commands are executed using a recursive algorithm that executes
// the deepest command first; we keep track of the position in the
// argv._ array that is currently being executed.
return command.runCommand(cmd, self, parsed, i + 1)
} else if (!firstUnknownCommand && cmd !== completionCommand) {
firstUnknownCommand = cmd
break
}
}
/Users/santhoshdc/.nvm/versions/node/v9.8.0/lib/node_modules/@google-cloud/functions-emulator/node_modules/yargs/yargs.js:542:25
freeze()
if (parseFn) exitProcess = false
console.log(typeof args)
const parsed = self._parseArgs(args, shortCircuit)
if (parseFn) parseFn(exitError, parsed, output)
unfreeze()
Where did I go wrong?
I was successful in running the Python code I wanted by deploying it as a cloud function.
This is my function.py
file content.
from FTUL import mainP
from cloudfn.google_account import get_credentials
from cloudfn.http import handle_http_event, Response
from google.cloud import bigquery
import json
def handle_http(req):
biquery_client = bigquery.Client(credentials=get_credentials())
if req.method == 'POST':
#extract information from the body of the POST request
rBody = json.loads(req.body)
im = rBody['image']
rID = rBody['refID']
rTable = rBody['refTable']
# Do the required operation from the extracted info
mainP(im,rID,rTable)
return Response(
status_code=200,
)
handle_http_event(handle_http)
The cloud function runs as expected, but always ends with error code 500.
When I check the Google logs, this is what is printed:
{
insertId: "some code"
labels: {
execution_id: "some id"
}
logName: "projects/network-api-production/logs/cloudfunctions.googleapis.com%2Fcloud-functions"
receiveTimestamp: "2018-03-28T11:34:48.880241330Z"
resource: {
labels: {
function_name: "handle_http"
project_id: "network-api-production"
region: "us-central1"
}
type: "cloud_function"
}
severity: "DEBUG"
textPayload: "Function execution took 5026 ms, finished with status code: 500"
timestamp: "2018-03-28T11:34:43.524350461Z"
The severity "DEBUG" here corresponds to LogSeverity (100): debug or trace information.
Can anyone help me resolve the issue?
@MartinSahlen Based on the logs, there is a response payload. However, the function seems to continue running and can't finish executing until it encounters a "connection error". Thus, I can't get the response from the function's URL.
from cloudfn.flask_handler import handle_http_event
from flask import Flask, request, Response
app = Flask('the-function')
@app.route('/run', methods=['GET'])
def testrun():
return Response(str(123456), mimetype='text/plain')
handle_http_event(app)
If you have Python modules required by main.py
in the same directory, it seems that PyInstaller will not pick these up by default. Adding a --paths
argument to PyInstaller may do the trick.
I'll be happy to put together a PR for this.
Great work!
I used function.py's handle_http as my cloud function to be built:
def handle_http(req):
biquery_client = bigquery.Client(credentials=get_credentials())
if req.method == 'OPTIONS':
return JsonResponse({"status":"success"},status=200,safe=False)
if req.method == 'GET':
print("Body: ",type(req.GET.getlist('image')))
print(type(req.GET.getlist('image')),req.GET.getlist('image'))
i=req.GET.getlist('image')
i=''.join(str(e) for e in i)
print(i)
rID = req.GET.getlist('refID')
rTable = req.GET.getlist('refTable')
print(rID," ",rTable)
########### main custom function. which I am calling ##################
mainP(i,rID,rTable)
return Response(
status_code=200,
body={'key': 2},
headers={'content-type': 'application/json'},
)
handle_http_event(handle_http)
This is the index.js generated after the build:
var googleAuth = require('google-auto-auth')();
//Handle Background events according to spec
function shimHandler(data) {
return new Promise((resolve, reject) => {
googleAuth.getToken(function (err, oauthToken) {
if (err) {
reject()
} else {
const p = require('child_process').execFile('./dist/func/func', {
env: Object.assign(process.env, {
'GOOGLE_OAUTH_TOKEN': oauthToken,
})
});
var lastMessage;
p.stdin.setEncoding('utf-8');
//Log standard err messages to standard err
p.stderr.on('data', (err) => {
console.error(err.toString());
})
p.stdout.on('data', (out) => {
console.log(out.toString());
lastMessage = out;
})
p.on('close', (code) => {
if (code !== 0) {
//This means the shim failed / panicked. So we reject hard.
reject();
} else {
// Resolve the promise with the latest output from stdout
// In case of shimming http, this is the response object.
resolve(lastMessage);
}
});
//Write the object/message/request to the shim's stdin and signal
//End of input.
p.stdin.write(JSON.stringify(data));
p.stdin.end();
}
});
});
}
//Handle http request
function handleHttp(req, res) {
var requestBody;
switch (req.get('content-type')) {
case 'application/json':
requestBody = JSON.stringify(req.body);
break;
case 'application/x-www-form-urlencoded':
//The body parser for cloud functions does this, so just play along
//with it, sorry man! Maybe we should construct some kind of proper
//form request body? or not. let's keep it this way for now, as
//This is how cloud functions behaves.
req.set('content-type', 'application/json')
requestBody = JSON.stringify(req.body);
break;
case 'application/octet-stream':
requestBody = req.body;
break;
case 'text/plain':
requestBody = req.body;
break;
}
var fullUrl = req.protocol + '://' + req.get('host') + req.originalUrl;
var httpRequest = {
'body': requestBody,
'headers': req.headers,
'method': req.method,
'remote_addr': req.ip,
'url': fullUrl
};
shimHandler(httpRequest)
.then((result) => {
data = JSON.parse(result);
res.status(data.status_code);
res.set(data.headers)
res.send(data.body);
})
.catch(() => {
res.status(500).end();
})
}
//
exports['handle_http'] = function(req, res) {
return handleHttp(req, res);
}//
When testing the locally generated URL using Postman, I'm getting this error:
{"stack":"TypeError: req.set is not a function\n at handleHttp (/Users/santhoshdc/Documents/LaunchTestOne/cloudfn/target/index.js:55:11)\n at exports.handle_http (/Users/santhoshdc/Documents/LaunchTestOne/cloudfn/target/index.js:90:10)\n at app.use (/Users/santhoshdc/.nvm/versions/node/v9.8.0/lib/node_modules/@google-cloud/functions-emulator/src/supervisor/worker.js:142:11)\n at Layer.handle [as handle_request] (/Users/santhoshdc/.nvm/versions/node/v9.8.0/lib/node_modules/@google-cloud/functions-emulator/node_modules/express/lib/router/layer.js:95:5)\n at trim_prefix (/Users/santhoshdc/.nvm/versions/node/v9.8.0/lib/node_modules/@google-cloud/functions-emulator/node_modules/express/lib/router/index.js:317:13)\n at /Users/santhoshdc/.nvm/versions/node/v9.8.0/lib/node_modules/@google-cloud/functions-emulator/node_modules/express/lib/router/index.js:284:7\n at Function.process_params (/Users/santhoshdc/.nvm/versions/node/v9.8.0/lib/node_modules/@google-cloud/functions-emulator/node_modules/express/lib/router/index.js:335:12)\n at next (/Users/santhoshdc/.nvm/versions/node/v9.8.0/lib/node_modules/@google-cloud/functions-emulator/node_modules/express/lib/router/index.js:275:10)\n at app.use (/Users/santhoshdc/.nvm/versions/node/v9.8.0/lib/node_modules/@google-cloud/functions-emulator/src/supervisor/worker.js:114:7)\n at Layer.handle [as handle_request] (/Users/santhoshdc/.nvm/versions/node/v9.8.0/lib/node_modules/@google-cloud/functions-emulator/node_modules/express/lib/router/layer.js:95:5)","message":"req.set is not a function","name":"TypeError"}
@MartinSahlen or anyone else who knows what's going on here, help please?
maybe add some kind of hot reloading dev server?
This is my main.py file. It just has a function called hello, which prints "Hello Universe".
from google.cloud import bigquery  # import needed for bigquery.Client below
from cloudfn.google_account import get_credentials
from cloudfn.http import handle_http_event, Response
biquery_client = bigquery.Client(credentials=get_credentials())
def hello():
print("Hello Universe")
def handle_http(req):
return Response(
status_code=200,
body={'key': 2},
headers={'content-type': 'application/json'},
)
handle_http_event(handle_http)
I'm able to build with no problem
py-cloud-fn hello http
_____ _ _ __
| __ \ | | | | / _|
| |__) | _ ______ ___| | ___ _ _ __| |______| |_ _ __
| ___/ | | |______/ __| |/ _ \| | | |/ _` |______| _| '_ \
| | | |_| | | (__| | (_) | |_| | (_| | | | | | | |
|_| \__, | \___|_|\___/ \__,_|\__,_| |_| |_| |_|
__/ |
|___/
Function: hello
File: main.py
Trigger: http
Python version: 2.7
Production: False
✓ Building, go grab a coffee...
✓ Generating javascript...
✓ Cleaning up...
Elapsed time: 8.3s
Output: ./cloudfn/target/index.js
Then I moved into the directory ./cloudfn/target (which contains index.js) and used the cloud-functions-emulator.
I ran functions start and got:
Warning: You're using Node.js v9.8.0 but Google Cloud Functions only supports v6.11.5.
Google Cloud Functions Emulator RUNNING
No functions deployed ¯\_(ツ)_/¯. Run functions deploy --help for how to deploy a function.
Then I ran functions deploy hello --trigger-http and got these errors:
ERROR: Function load error: Code could not be loaded.
ERROR: Does the file exists? Is there a syntax error in your code?
ERROR: Detailed stack trace: module.js:545
throw err;
^Error: Cannot find module 'google-auto-auth'
at Function.Module._resolveFilename (module.js:543:15)
at Function.Module._load (module.js:470:25)
at Module.require (module.js:593:17)
at require (internal/module.js:11:18)
at Object.<anonymous> (/Users/santhoshdc/Documents/Dummy python Test/cloudfn/target/index.js:1:80)
at Module._compile (module.js:649:30)
at Object.Module._extensions..js (module.js:660:10)
at Module.load (module.js:561:32)
at tryModuleLoad (module.js:501:12)
at Function.Module._load (module.js:493:3)ERROR: Error: Failed to deploy function.
at exec (/Users/santhoshdc/.nvm/versions/node/v9.8.0/lib/node_modules/@google-cloud/functions-emulator/src/cli/controller.js:126:22)
at ChildProcess.exithandler (child_process.js:280:5)
at ChildProcess.emit (events.js:180:13)
at maybeClose (internal/child_process.js:936:16)
at Socket.stream.socket.on (internal/child_process.js:353:11)
at Socket.emit (events.js:180:13)
at Pipe._handle.close [as _onclose] (net.js:538:12)
Can anyone point out what went wrong?
I'm clueless
I'm testing to check the output of the function before deploying it as a cloud function.
Can anyone explain what's happening here
I am able to send my data inside the body
of the return Response
here
My program is getting stuck
def handle_http_event(handle_fn):
req = Request(json.loads(sys.stdin.read()))
res = handle_fn(req)
if isinstance(res, Response):
sys.stdout.write(res._json_string())
else:
sys.stdout.write(Response()._json_string())
It gets stuck at req = Request(json.loads(sys.stdin.read())) and doesn't proceed further.
I'm confused, can anyone help?
PS: I have copy-pasted this into my function.py
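The blocking read is by design: the Node shim writes the request JSON to the Python process's stdin and then closes it. Run function.py directly with nothing on stdin, and sys.stdin.read() never returns, which looks like a hang. A simplified sketch of the handoff (the real handle_http_event wraps the dict in a Request object, omitted here):

```python
import io
import json
import sys

def handle_http_event(handle_fn):
    # Simplified: block until stdin is closed, then parse the request JSON.
    req = json.loads(sys.stdin.read())
    return handle_fn(req)

# Simulate what the Node shim does: write the request to stdin, then end input.
sys.stdin = io.StringIO(json.dumps({"method": "POST", "body": '{"refID": "50"}'}))
result = handle_http_event(lambda req: req["method"])
```

Piping JSON into the process (or closing stdin) is what unblocks the read when testing locally.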
Hi there,
Sorry if this is not the right place to post (I searched on stackoverflow but there was no py-cloud-fn tag) I deployed the "storage" sample (as is) and when running it gives this error:
15:45:43.560 test2 155785686371109 ./dist/func/func: 1: ./dist/func/func: Syntax error: "(" unexpected { insertId: "000000-eb6149ef-f09f-48aa-976d-1ec133924c51" labels: {…} logName: "projects/xxxxxxxx/logs/cloudfunctions.googleapis.com%2Fcloud-functions" receiveTimestamp: "2017-09-25T22:45:52.884721958Z" resource: {…} severity: "ERROR" textPayload: "./dist/func/func: 1: ./dist/func/func: Syntax error: "(" unexpected " timestamp: "2017-09-25T22:45:43.560Z" }
Anything I might be doing wrong?
Thanks,
M
I got the following error when I accessed the endpoint of the flask handler.
2017-11-05T08:30:31.471Z - error: exec(bytecode, module.__dict__)
File "site-packages/google/cloud/storage/__init__.py", line 39, in <module>
File "/Users/seijik/.pyenv/versions/venv27/lib/python2.7/site-packages/PyInstaller/loader/pyimod03_importers.py", line 389, in load_module
exec(bytecode, module.__dict__)
File "site-packages/google/cloud/storage/bucket.py", line 24, in <module>
File "/Users/seijik/.pyenv/versions/venv27/lib/python2.7/site-packages/PyInstaller/loader/pyimod03_importers.py", line 389, in load_module
exec(bytecode, module.__dict__)
File "site-packages/google/api_core/__init__.py", line 23, in <module>
File "site-packages/pkg_resources/__init__.py", line 562, in get_distribution
File "site-packages/pkg_resources/__init__.py", line 436, in get_provider
File "site-packages/pkg_resources/__init__.py", line 981, in require
File "site-packages/pkg_resources/__init__.py", line 867, in resolve
pkg_resources.DistributionNotFound: The 'google-api-core' distribution was not found and is required by the application
Failed to execute script main
The build command is the following:
py-cloud-fn video http -p -f main.py && cd cloudfn/target && gcloud beta functions deploy video --trigger-http --stage-bucket mybucket && cd ../..
Once I added the following line to cloudfn/hooks/hook-google.cloud.storage.py:
datas += copy_metadata('google-api-core')
it was fixed on the emulator. However, I still have the same error in production.
Do you have any idea? I guess some Google lib changed a dependency, but I'm not certain at the moment.
I'm using the following Google libraries:
google-auth==1.0.1
google-cloud==0.27.0
google-cloud-bigquery==0.28.0
google-cloud-storage==1.6.0
macOS Sierra version 10.12.6
Python 2.7.10
Thanks!
I'm using Python 3.5.
While building the function with py-cloud-fn handle_http http -f function.py --python_version 3.5, I got these hidden import errors:
INFO: Analyzing hidden import 'htmlentitydefs'
ERROR: Hidden import 'htmlentitydefs' not found
INFO: Analyzing hidden import 'HTMLParser'
ERROR: Hidden import 'HTMLParser' not found
INFO: Analyzing hidden import 'Cookie'
ERROR: Hidden import 'Cookie' not found
INFO: running Analysis out00-Analysis.toc
I checked these imports in the official documentation:
Note: The htmlentitydefs module has been renamed to html.entities in Python 3. The 2to3 tool will automatically adapt imports when converting your sources to Python 3. (official link)
We do have HTMLParser (as html.parser) in Python 3.5 as well, according to the official link.
We do have Cookie (as http.cookies) in Python 3.5, according to this official link.
These can be included using hidden imports, and I can see all of these hidden imports being added here.
So why am I still getting this error?
Is a fix needed here? If so, how can I make it, or am I missing something that I have done wrong?
Any pointers would be really helpful.
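For context, all three names are Python 2 modules that were renamed in Python 3, so under a 3.5 interpreter there is nothing for PyInstaller to resolve and the ERROR lines are expected. A quick sketch of the Python 3 equivalents:

```python
# Python 2 name   -> Python 3 replacement
# htmlentitydefs  -> html.entities
# HTMLParser      -> html.parser
# Cookie          -> http.cookies
from html.entities import name2codepoint
from html.parser import HTMLParser
from http.cookies import SimpleCookie

# The old Python 2 names simply do not exist on Python 3.5, so the
# "Hidden import ... not found" errors above refer to compatibility
# fallbacks that are never needed on 3.x.
print(name2codepoint['amp'])  # -> 38, the codepoint of '&'
```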
Or not. I have tested a lot empirically to get this working. Cloud Functions is still in beta with the API in flux, so it's probably not a good idea to spend time on this before it stabilizes.
The Cloud Function I deployed works fine. In some cases, though, the function might take longer than a minute to execute, and Cloud Functions is set to terminate the function after exactly one minute.
I found Google documentation here that allows us to run a Cloud Function for up to 9 minutes.
This is the normal Node.js code to check that the function executes within a minute:
/**
* HTTP Cloud Function that may not completely
* execute due to early HTTP response
*
* @param {Object} req Cloud Function request context.
* @param {Object} res Cloud Function response context.
*/
exports.afterResponse = (req, res) => {
res.end();
// This statement may not execute
console.log('Function complete!');
};
Below is the snippet for running it longer than a minute; in this example it is set to 2 minutes.
/**
* HTTP Cloud Function that may not completely
* execute due to function execution timeout
*
* @param {Object} req Cloud Function request context.
* @param {Object} res Cloud Function response context.
*/
exports.afterTimeout = (req, res) => {
setTimeout(() => {
// May not execute if function's timeout is <2 minutes
console.log('Function running...');
res.end();
}, 120000); // 2 minute delay
};
Which code inside index.js
should I alter to extend the time?
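As far as I can tell from the documentation, the execution limit is not controlled by anything in index.js at all; it is a deploy-time property of the function, set with the --timeout flag. A sketch, reusing the function name from the deploy command earlier in this thread:

```shell
# The timeout is set when deploying, not in code.
# 540s (9 minutes) is the documented maximum.
gcloud beta functions deploy video --trigger-http --timeout=540s
```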
When building the 3.5 image, it fails with:
Reading package lists...
W: The repository 'http://security.ubuntu.com/ubuntu zesty-security Release' does not have a Release file.
W: The repository 'http://archive.ubuntu.com/ubuntu zesty Release' does not have a Release file.
W: The repository 'http://archive.ubuntu.com/ubuntu zesty-updates Release' does not have a Release file.
W: The repository 'http://archive.ubuntu.com/ubuntu zesty-backports Release' does not have a Release file.
E: Failed to fetch http://security.ubuntu.com/ubuntu/dists/zesty-security/universe/source/Sources 404 Not Found [IP: 91.189.88.149 80]
E: Failed to fetch http://archive.ubuntu.com/ubuntu/dists/zesty/universe/source/Sources 404 Not Found [IP: 91.189.88.152 80]
E: Failed to fetch http://archive.ubuntu.com/ubuntu/dists/zesty-updates/universe/source/Sources 404 Not Found [IP: 91.189.88.152 80]
E: Failed to fetch http://archive.ubuntu.com/ubuntu/dists/zesty-backports/universe/binary-amd64/Packages 404 Not Found [IP: 91.189.88.152 80]
E: Some index files failed to download. They have been ignored, or old ones used instead.
This is because 17.04 reached end-of-life a few weeks ago.
Simply switching to the 17.10 image produces a new error:
Step 1/4 : FROM ubuntu:17.10
---> a8ad041f5225
Step 2/4 : RUN apt-get update && DEBIAN_FRONTEND=noninteractive apt-get install -y python3.5 python3-pip && apt-get clean && rm -rf /var/lib/apt/lists/*
---> Using cache
---> e205e19f7ef7
Step 3/4 : RUN python3.5 -m pip install pip==9.0.1
---> Running in 06f16e1152a7
/bin/sh: 1: python3.5: not found
I'll try and solve it, but if anyone has any ideas...
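One possible workaround, assuming the base image is the only constraint: Ubuntu 16.04 is an LTS release (so its mirrors are still live) and it packages Python 3.5, unlike 17.10. A Dockerfile sketch:

```dockerfile
# Sketch: build on 16.04, which still ships python3.5 and has live mirrors
FROM ubuntu:16.04
RUN apt-get update && \
    DEBIAN_FRONTEND=noninteractive apt-get install -y python3.5 python3-pip && \
    apt-get clean && rm -rf /var/lib/apt/lists/*
RUN python3.5 -m pip install pip==9.0.1
```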
OS:
Mac
Python:
Python3
Command:
pip3 install pycloudfn
Error:
Installing collected packages: pefile, altgraph, macholib, pyinstaller, six, python-dateutil, werkzeug, django, Jinja2, futures, pyspin, pyasn1-modules, cachetools, google-auth, pycloudfn
Found existing installation: six 1.11.0
Uninstalling six-1.11.0:
Successfully uninstalled six-1.11.0
Rolling back uninstall of six
Could not install packages due to an EnvironmentError: [Errno 13] Permission denied: '/usr/local/lib/python3.6/site-packages/six.py'
Consider using the `--user` option or check the permissions
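Two common workarounds for this class of permission error, if you'd rather not sudo into the system site-packages:

```shell
# Per-user install, no root required:
pip3 install --user pycloudfn

# Or isolate everything in a virtualenv:
python3 -m venv env
. env/bin/activate
pip install pycloudfn
```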
Build command:
py-cloud-fn pyz-function --python_version 2.7 http --production
Error:
Cannot connect to the Docker daemon at unix:///var/run/docker.sock. Is the docker daemon running?
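The production build shells out to Docker, so the daemon has to be reachable before running the build. On Linux that typically looks like the following sketch (exact commands depend on your distro; on a Mac, starting Docker Desktop is usually enough):

```shell
# Start the Docker daemon and allow your user to reach its socket:
sudo systemctl start docker
sudo usermod -aG docker "$USER"   # takes effect after logging out and back in
docker info                       # should now succeed
```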
I would like to store a few files on Google Cloud Storage. Since I am using a Python cloud function, is it possible?
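For reference, a minimal sketch of an upload using the google-cloud-storage client library. The bucket and object names here are made-up placeholders, and the library would need to be listed in requirements.txt so it gets bundled with the function:

```python
# Sketch: upload a string to GCS from inside a Python cloud function.
# 'my-bucket' and the object path are hypothetical placeholders.
from google.cloud import storage

client = storage.Client()  # uses the function's default credentials
bucket = client.bucket('my-bucket')
blob = bucket.blob('uploads/result.txt')
blob.upload_from_string('hello from a python cloud function')
```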
Hi Martin,
First of all, congrats for the tool. It seems really promising!
I've been trying to run the handle_http example. I get status: READY (everything goes fine, apparently) but get a 500 when I open the resulting URL. When I check Google Cloud Error Reporting, I see this error:
ImportError: No module named urlparse Failed to execute script function
at _import_module (site-packages/six.py:82)
at _resolve (site-packages/six.py:160)
at __get__ (site-packages/six.py:92)
at <module> (site-packages/cloudfn/http.py:5)
at load_module (/tmp/pip-build-QIE5nA/pyinstaller/PyInstaller/loader/pyimod03_importers.py:389)
at <module> (function.py:1)
My steps are:
1/ create files function.py, deploy.sh and requirements.txt
2/ create new virtualenv and install requirements
3/ launch deploy
I checked in the Python console whether I can import urlparse, or six.moves.urllib.parse, and everything seems fine. I am running everything on Python 2.7.
I am a bit lost, maybe you can help me.
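One thing worth checking: six.moves resolves its submodules lazily, so PyInstaller's static analysis can miss the underlying urlparse module even though it imports fine in a live interpreter. Importing the real module directly (or declaring it as a hidden import) is a common workaround; a minimal version-agnostic sketch:

```python
# Import the concrete module directly so PyInstaller's static analysis
# sees it; six.moves' lazy loading can hide it from the bundler.
try:
    from urlparse import urlparse        # Python 2
except ImportError:
    from urllib.parse import urlparse    # Python 3

netloc = urlparse('https://example.com/a?b=1').netloc
print(netloc)
```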