
python-json-logger's Introduction

Build Status License Version

Overview

This library allows standard Python logging to output log data as JSON objects. With JSON we can make our logs more readable by machines, and we can stop writing custom parsers for syslog-type records.

News

Hi, I see this package is quite alive and I am sorry for ignoring it for so long. I will be stepping up my maintenance of this package, so please allow me a week to get things back in order (most likely with a new minor version), and I'll post an update here once I am caught up.

Installing

Pip:

pip install python-json-logger

Pypi:

https://pypi.python.org/pypi/python-json-logger

Manual:

python setup.py install

Usage

Integrating with Python's logging framework

JSON output is provided by the JsonFormatter logging formatter. You can add the custom formatter like below:

Please note: version 0.1.0 changed the import structure; please update to the following example for proper importing.

    import logging
    from pythonjsonlogger import jsonlogger

    logger = logging.getLogger()

    logHandler = logging.StreamHandler()
    formatter = jsonlogger.JsonFormatter()
    logHandler.setFormatter(formatter)
    logger.addHandler(logHandler)

Customizing fields

The fmt parser can also be overridden if you want to have required fields that differ from the default of just message.

These two invocations are equivalent:

class CustomJsonFormatter(jsonlogger.JsonFormatter):
    def parse(self):
        return self._fmt.split(';')

formatter = CustomJsonFormatter('one;two')

# is equivalent to:

formatter = jsonlogger.JsonFormatter('%(one)s %(two)s')

You can also add extra fields to your JSON output by specifying a dict in place of the message, as well as by specifying an extra={} argument.

Contents of these dictionaries will be added at the root level of the entry and may override basic fields.

You can also use the add_fields method to add to or generally normalize the set of default fields; it is called for every log event. For example, to unify default fields with those provided by structlog, you could do something like this:

from datetime import datetime

class CustomJsonFormatter(jsonlogger.JsonFormatter):
    def add_fields(self, log_record, record, message_dict):
        super(CustomJsonFormatter, self).add_fields(log_record, record, message_dict)
        if not log_record.get('timestamp'):
            # this doesn't use record.created, so it is slightly off
            now = datetime.utcnow().strftime('%Y-%m-%dT%H:%M:%S.%fZ')
            log_record['timestamp'] = now
        if log_record.get('level'):
            log_record['level'] = log_record['level'].upper()
        else:
            log_record['level'] = record.levelname

formatter = CustomJsonFormatter('%(timestamp)s %(level)s %(name)s %(message)s')

Items added to the log record will be included in every log message, no matter what the format requires.

Adding custom object serialization

For custom handling of object serialization, you can specify a default JSON object translator or provide a custom encoder:

import json

def json_translate(obj):
    if isinstance(obj, MyClass):
        return {"special": obj.special}

formatter = jsonlogger.JsonFormatter(json_default=json_translate,
                                     json_encoder=json.JSONEncoder)
logHandler.setFormatter(formatter)

logger.info({"special": "value", "run": 12})
logger.info("classic message", extra={"special": "value", "run": 12})

Using a Config File

To use the module with a config file using the fileConfig function, use the class pythonjsonlogger.jsonlogger.JsonFormatter. Here is a sample config file.

[loggers]
keys = root,custom

[logger_root]
handlers =

[logger_custom]
level = INFO
handlers = custom
qualname = custom

[handlers]
keys = custom

[handler_custom]
class = StreamHandler
level = INFO
formatter = json
args = (sys.stdout,)

[formatters]
keys = json

[formatter_json]
format = %(message)s
class = pythonjsonlogger.jsonlogger.JsonFormatter

Example Output

Sample JSON with a full formatter (basically the log message from the unit tests). Every log message will appear on one line, like a typical logger.

{
    "threadName": "MainThread",
    "name": "root",
    "thread": 140735202359648,
    "created": 1336281068.506248,
    "process": 41937,
    "processName": "MainProcess",
    "relativeCreated": 9.100914001464844,
    "module": "tests",
    "funcName": "testFormatKeys",
    "levelno": 20,
    "msecs": 506.24799728393555,
    "pathname": "tests/tests.py",
    "lineno": 60,
    "asctime": "12-05-05 22:11:08,506248",
    "message": "testing logging format",
    "filename": "tests.py",
    "levelname": "INFO",
    "special": "value",
    "run": 12
}

External Examples


python-json-logger's Issues

Only "%" style is supported

Since Python 3.2, formatters support new "{" and "${" styles, however JsonFormatter's parse method explicitly expects "%" notation.

This results in an empty message if e.g. {message} is used as the logger format string.

Remove setuptools as a requirement

setuptools is not required to run the package, but to install the package. To install it, the user would already need to have setuptools. Having setuptools in the install_requires doesn't seem to provide any benefit, but does cause some problems with stricter packaging tools (see pypa/pipenv#1417)

json serialization enhancement

Having the items in the json output show up in the same order as the variables in the format string improves readability.

Instead of:
log_record = {}
use:
log_record = OrderedDict()

If you're worried about compatibility with older versions of python you could wrap it in a try or something.

try:
    from collections import OrderedDict
except Exception:
    pass

try:
    log_record = OrderedDict()
except Exception:
    log_record = {}

How to append logs to the current dict?

Thanks for the nice library! I tried the example given in the readme section; it's working fine.

My problem is that I am using logging.FileHandler('reuben.json') to write the dump to a file. What happens is that every log message is entered as a separate entry instead of as an item of a list or dictionary.

Is it possible to create the JSON in the format given below:

log.error({"host": "host1", "message": "Failed to install"})
log.info({"host": "host2", "message": "successfully installed"})

json :

[{"host": "host1", "message": "Failed to install"}, {"host": "host2", "message": "successfully installed"}]
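Not directly: each handler emits one JSON object per line (the JSON Lines convention), not a growing array. A sketch of reading such a file back into the list shape asked for, with sample data written inline so the snippet is self-contained (the file name reuben.json is taken from the question):

```python
import json

# Write two JSON-lines entries the way FileHandler + JsonFormatter would.
with open("reuben.json", "w") as f:
    f.write('{"host": "host1", "message": "Failed to install"}\n')
    f.write('{"host": "host2", "message": "successfully installed"}\n')

# Read them back into a list, one json.loads per line.
with open("reuben.json") as f:
    entries = [json.loads(line) for line in f if line.strip()]

print(entries[0]["host"])  # host1
```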

Package project as Python wheel

In addition to the source package, there is a new standard for packaging projects. As this is a pure Python project, it should be a matter of following the instructions on http://pythonwheels.com. This should make the installation faster (plus other benefits, see site).

use py.test for testing.

Would you be open to migrating the testing to py.test? I would also ship a fixture for testing; then, if someone uses this library and wants to test a custom formatter, the setup will already be in place.

Examples of remapping fields?

If one wanted to apply custom transformations like remapping a field message -> msg, how would they go about doing that? This is the simplest and most common type of transformation so I think an example in the README would be great.

WARNING: '' not a valid package name; please use only.-separated package names in setup.py

I'm getting the following output when installing this package:

Downloading/unpacking python-json-logger
  Downloading python-json-logger-0.0.6.tar.gz
  Running setup.py (path:<path>/build/python-json-logger/setup.py) egg_info for package python-json-logger
    WARNING: '' not a valid package name; please use only.-separated package names in setup.py

Requirement already satisfied (use --upgrade to upgrade): setuptools in <path>/lib/python2.7/site-packages (from python-json-logger)
Installing collected packages: python-json-logger
  Running setup.py install for python-json-logger
    WARNING: '' not a valid package name; please use only.-separated package names in setup.py

Successfully installed python-json-logger
Cleaning up...

I think the values for package_dir and/or packages need to be changed for this to be fixed.

Python JSON logger logs extra fields even if they're not included in format

Hello,

This library outputs all extra data all the time, even when such fields don't exist in the format string.

I used a very minimal format to demonstrate the problem.

[formatter_default_json]
class = pythonjsonlogger.jsonlogger.JsonFormatter
format = %(asctime)s

some uvicorn output with this config:

{"asctime": "2020-09-12 22:52:48,944", "color_message": "Finished server process [\u001b[36m%d\u001b[0m]"}
{"asctime": "2020-09-12 22:52:48,944"}
{"asctime": "2020-09-12 22:52:48,944", "color_message": "Finished server process [\u001b[36m%d\u001b[0m]"}
{"asctime": "2020-09-12 22:52:49,024"}
{"asctime": "2020-09-12 22:52:49,024"}

Similarly with the access logs:

{"asctime": "2020-09-12 22:53:19,895", "status_code": 200, "scope": {"type": "http", "asgi": {"version": "3.0", "spec_version": "2.1"}, "http_version": "1.1", "server": ["127.0.0.1", 8000], "client": ["127.0.0.1", 58918], "scheme": "http", "method": "GET", "root_path": "", "path": "/", "raw_path": "b'/'", "query_string": "b''", "headers": [["b'user-agent'", "b'curl/7.29.0'"], ["b'host'", "b'localhost:8000'"], ["b'accept'", "b'*/*'"]], "fastapi_astack": "<contextlib.AsyncExitStack object at 0x7f208c8e8460>", "app": "<fastapi.applications.FastAPI object at 0x7f208d6bc430>", "router": "<fastapi.routing.APIRouter object at 0x7f208d6bc580>", "endpoint": "<function root at 0x7f208c8e4820>", "path_params": {}}}

You can see that these are all extra fields in uvicorn's source code:

repos/uvicorn/uvicorn/config.py:            extra={"color_message": color_message},
repos/uvicorn/uvicorn/main.py:        logger.info(message, process_id, extra={"color_message": color_message})
repos/uvicorn/uvicorn/main.py:            extra={"color_message": color_message},
repos/uvicorn/uvicorn/main.py:                extra={"color_message": color_message},
repos/uvicorn/uvicorn/protocols/http/h11_impl.py:                    extra={"status_code": status_code, "scope": self.scope},
repos/uvicorn/uvicorn/protocols/http/httptools_impl.py:                    extra={"status_code": status_code, "scope": self.scope},
repos/uvicorn/uvicorn/supervisors/basereload.py:        logger.info(message, extra={"color_message": color_message})
repos/uvicorn/uvicorn/supervisors/basereload.py:        logger.info(message, extra={"color_message": color_message})
repos/uvicorn/uvicorn/supervisors/multiprocess.py:        logger.info(message, extra={"color_message": color_message})
repos/uvicorn/uvicorn/supervisors/multiprocess.py:        logger.info(message, extra={"color_message": color_message})

Is there any way to exclude these fields from the output via an INI config file at present?

Thanks
Fotis

Question: Handling http library logging variables like from Flask or Gunicorn

I have been using this excellent library for many different services, where it works great. However, I've tried to use it for some API services where I can't get it to properly log the 'extra' variables usually associated with HTTP clients, such as status code, remote host, remote port, path and so forth. For instance, this is an example of a log entry using the json-logging library:

{"type": "request", "written_at": "2019-04-10T08:59:49.054Z", "written_ts": 1554886789054874000, "component_id": "-", "component_name": "api", "component_instance": 0, "correlation_id": "xyz", "remote_user": "-", "request": "/api/v1/ping", "referer": "-", "protocol": "HTTP/1.0", "method": "GET", "remote_ip": "123.123.123.123", "request_size_b": -1, "remote_host": "123.123.123.123", "remote_port": "40167", "request_received_at": "2019-04-10T08:59:49.049Z", "response_time_ms": 5, "response_status": 200, "response_size_b": 15, "response_content_type": "application/json", "response_sent_at": "2019-04-10T08:59:49.054Z"}

Whereas with the python-json-logger library, the log entries look more like this:

{"asctime": "2019-04-10 09:01:28,220", "name": "gunicorn.error", "levelname": "DEBUG", "message": "GET /api/v1/ping"}

I've tried adding logging filters to attach the request and response data to the logs, but they don't really show up. For instance, code like the below doesn't really appear to do much:

    class FilterAddContext(logging.Filter):
        def filter(self, record):
            if request:
                record.endpoint = request.endpoint or ""
                return True
            else:
                return True

I much prefer the python-json-logger library over the json-logging library, but this is the part I'm struggling to get working correctly. It works for all other logging purposes, where I just need to specify extras, but I haven't been able to get this working properly.

Hope you can provide some insights on how this should be done.

No StackTrace info on the log

Hello.

I'm trying to use this module to format django's log. The problem I'm having is that the full message of the stack trace when there's an exception, including the actual Exception name and message, is not showing up in my log files.

Any idea for how I might fix this?

sort_keys arg

A sort_keys would be nice.

I know I can implement it myself with a few lines of code, but that doesn't work from a logging config file.

Specifying extra fields to skip via INI config?

Hey there guys, I'm sorry if I've missed this, but how can I configure Python JSON logger to omit fields via the INI config?

I can see that the reserved_attrs kwarg does what I need, but I haven't had any luck getting this configured in my logging INI file.

I've tried the following

[formatter_default_json]
class = pythonjsonlogger.jsonlogger.JsonFormatter
format = %(asctime)s %(name)s %(levelname)s %(message)s

reserved_attrs = color_message
OR
reserved_attrs = ['color_message']
OR
args = (reserved_attrs=['color_message'])

Thanks heaps
Fotis

Please add a changelog file

In addition to #47, it would be nice if there was a changelog file detailing each release (makes it easy to find and it's also available when offline). It could also be part of the readme.

OrderedDict causing recursion

Hi @madzak, this library is great and we're using it in production.
But when using it with tornado I can sometimes reproduce this:

Traceback (most recent call last):
  File "/usr/local/Cellar/python/2.7.8_2/Frameworks/Python.framework/Versions/2.7/lib/python2.7/logging/__init__.py", line 859, in emit
    msg = self.format(record)
  File "/usr/local/Cellar/python/2.7.8_2/Frameworks/Python.framework/Versions/2.7/lib/python2.7/logging/__init__.py", line 732, in format
    return fmt.format(record)
  File "/Users/gabrielfalcao/.virtualenvs/device_management/lib/python2.7/site-packages/pythonjsonlogger/jsonlogger.py", line 117, in format
    log_record = OrderedDict()
  File "/usr/local/Cellar/python/2.7.8_2/Frameworks/Python.framework/Versions/2.7/lib/python2.7/collections.py", line 52, in __init__
    self.__update(*args, **kwds)
RuntimeError: maximum recursion depth exceeded while calling a Python object

I don't have time to dig into this, because the output generated is actually obfuscating my actual error, but I hope that this bug report makes sense to you.

Thank you

"Command" has no attribute "load"

I am making a Discord bot (which, I know, is unrelated to this repository) and I'm using JSON. Whenever I try to run the commands using JSON, however, I keep getting the following error:
AttributeError: 'Command' object has no attribute 'load'
This was working just 2 days ago, and the next morning it suddenly stopped. Here is a snippet of my code (I am using Python 3.7.3):

     gp = json.load(g)
     user = "userid_" + str(userid)
     if user not in gp:
         addprofile(userid);

Doesn't respect `format` keyword

Say I define a logging config like this:

version: 1

formatters:

  json:
    (): pythonjsonlogger.jsonlogger.JsonFormatter
    format: "%(name)s %(levelname)s [%(filename)s:%(lineno)s] %(message)s"

handlers:

  console:
    class: logging.StreamHandler
    level: INFO
    formatter: json
    stream: ext://sys.stdout

root:
  level: INFO
  handlers: [ console ]

The resulting JSON includes no field that shows the log line in the format specified, including the filename and line number (lineno) before each message. Instead, it seems to parse the format string to determine that it needs to include the filename and lineno attributes in each log line.

This seems to be a deliberate design decision since it's present in the very first commit of the project (the formatterParse variable). I'm curious why you overloaded the format parameter in this way to specify what the JSON should include, and why you override the format parameter to never actually be used as it is meant to. (I'd be happy to contribute an update to the Readme once I understand your rationale.)

more verbose defaults for JsonFormatter

it looks like the default format is just '%(message)s'. It certainly works, but provides no additional info for log parsers to use.

In order for someone to get the maximum benefit with the minimum effort from this package, I'd suggest changing the default format to something a bit more verbose, like '%(message)s %(levelname)s %(funcName)s %(name)s %(pathname)s'

python 2.6 support?

I'm playing around with this, and am wondering why Python 2.6 isn't supported. I tried to override the version check and installed it, and it seems to run fine on Python 2.6 as well.

Anything I should be careful with / aware of? Otherwise, could the version check be relaxed to include 2.6?

Unable to use logger.info

Environment

OS: Alpine 3.5 and OSX 10.11.2
Python: 2.7.14
python-json-logger: v0.1.8

Problem

Unable to use logger.info; however, logger.error works fine

vikas027 # python
Python 2.7.14 (default, Dec 14 2017, 15:51:29)
[GCC 6.4.0] on linux2
Type "help", "copyright", "credits" or "license" for more information.
>>> import logging
>>> from pythonjsonlogger import jsonlogger
>>> logger = logging.getLogger()
>>> logHandler = logging.StreamHandler()
>>> formatter = jsonlogger.JsonFormatter()
>>> logHandler.setFormatter(formatter)
>>> logger.addHandler(logHandler)
>>> logger.info("classic message", extra={"special": "value", "run": 12})
>>> logger.error("classic message", extra={"special": "value", "run": 12})
{"message": "classic message", "special": "value", "run": 12}
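This is standard logging behaviour rather than a formatter issue: the root logger's effective level defaults to WARNING, so info() records are dropped before any handler sees them. Lowering the level fixes it:

```python
import logging

logger = logging.getLogger()
# The root logger defaults to WARNING; lower it so info() records pass.
logger.setLevel(logging.INFO)
```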

Unreleased version on PyPI?

PyPI currently offers python-json-logger==0.1.11 (https://pypi.org/project/python-json-logger/), but the latest version available in this repository is 0.1.10. Also there is no changelog for such a version, neither here nor on PyPI. So what's the matter with 0.1.11? While I don't believe that it's the case, one could believe that the package currently available through PyPI is a compromised one.

Type Annotations

I'm currently annotating a code base and wonder what the types of add_fields(self, log_record, record, message_dict) are. My current best guess:

from typing import Any, Dict
import logging

def add_fields(self, log_record, record: logging.LogRecord, message_dict: Dict[str, Any]) -> None: ...

What is log_record? Maybe a Dict?

On a more general note: What do you think about adding type annotations? As you still support Python 3.4 and 3.5, they should be added as type stubs. If you want, I can help you get started :-)

Invalid format '(levelname) (name) (message)' for '%' style

It seems like with Python 3.8 the format string needs to be specified differently.

I have this small test setup:

This is test.py:

import os
import logging
from pythonjsonlogger import jsonlogger

logger = logging.getLogger()
logHandler = logging.StreamHandler()
formatter = jsonlogger.JsonFormatter("(levelname) (name) (message)")
logHandler.setFormatter(formatter)
logger.handlers = [logHandler]
logger.setLevel(logging.INFO)

logger.info("Hello world")

and this Dockerfile:

FROM python:3.7
RUN pip install python-json-logger
COPY test.py .
CMD ["python", "test.py"]

If I run docker build -t test . && docker run -it test it works as expected, the output is {"levelname": "INFO", "name": "root", "message": "Hello world"}.

Under Python 3.8 this results in ValueError: Invalid format '(levelname) (name) (message)' for '%' style. I have to change the format line to read formatter = jsonlogger.JsonFormatter("%(levelname) %(name) %(message)"). I'm not sure if this is expected, but it might (at least) warrant an update of the documentation.

Release 2.0

Hello @madzak.

The readme says:

NOTICE: Python2.7 is only supported up to v0.1.11. The next release, v2.0, will support Python3 only

Python 2 is unsupported since 1st Jan 2020. Is there any roadmap for 2.0 or could you just make a new release supporting only Python3?

Thanks!

dictConfig does not work as expected, checked with 0.1.11

I'm trying to make jsonlogger work with dictConfig.
I tried this:

def setup_logging(loglevel, path):
    cfg = {
        "version": 1,
        "disable_existing_loggers": True,
        "formatters": {
            "json": {
                "class": "pythonjsonlogger.jsonlogger.JsonFormatter",
                "format": "%(asctime)s.%(msecs)03d %(levelname)s %(message)s",
            }
        },
        "handlers": {
            "file": {
                "class": "logging.FileHandler",
                "filename": path,
                "mode": "a",
                "formatter": "json",
            }
        },
        "root": {"level": getattr(logging, loglevel), "handlers": ["file"]},
    }
    logging.config.dictConfig(cfg)

setup_logging("INFO", "log")
logging.warning("bla")

And I get this in file:

2019-10-22 17:48:52,224.224 WARNING bla

missing timestamp? severity obeyed?

Hi
Just trying out this library, perhaps I'm missing something in the docs:

I don't see any timestamp or severity in the emitted log statement:

I just get {"message": "hello"}

import logging
import sys
from pythonjsonlogger import jsonlogger

logger = logging.getLogger()

logHandler = logging.StreamHandler(sys.stdout)
formatter = jsonlogger.JsonFormatter()
logHandler.setFormatter(formatter)
logger.addHandler(logHandler)
logger.setLevel("INFO")

logger.warn("hello")

Do I have to manually handle this standard stuff myself, like your CustomJsonFormatter? And if so, how do I get the record creation timestamp, as opposed to creating my own, which would have a delay?

Use ujson instead of standard library json

I've been trying to use ujson to serialize the records to string but I can't get it to work, I always get an error on line 172 in jsonlogger.py: ensure_ascii=self.json_ensure_ascii)
TypeError: an integer is required (got type NoneType)

In my logging handler I'm doing this:
self.formatter = CustomJsonFormatter( '(timestamp) (level) (name) (message)', json_serializer='ujson.dumps', )

I've tried passing json_ensure_ascii=True as well but that doesn't seem to have any effect

Add possibility to provide ensure_ascii parameter to json.dumps.

UTF-8 is rather common now, and I'd like to see my national messages readable after formatting. Now it looks like this:

In [20]: log.debug('Привет')
{"message": "\u041f\u0440\u0438\u0432\u0435\u0442"}

But I want it as

In [20]: log.debug('Привет')
{"message": "Привет"}

PR#63

Tag PyPI releases in Git

Would it be possible to tag all the commits for the releases that you've made on PyPI?

This would make it much easier to review code for a particular PyPI version.

Have you considered using zest.releaser to manage versions in Git and PyPI and your changelog?

Add program name and severity in prefix

Hi,

I want to send JSON logging to rsyslogd over UDP and configure rsyslogd to write those in a file.
The only way I can do that in rsyslogd is to configure it to look at the program name with the following snippet in rsyslog.d:

if $programname startswith 'anycast-healthchecker' then /var/log/anyacast-healthchecker.log
&~

and set prefix to anycast-healthchecker. But the logs don't arrive in the file, as rsyslog expects to see first the severity (levelname in Python) and then the program name. If I set prefix to info anycast-healthchecker then rsyslogd is able to match the program name and route the log to the file.

levelname is dynamic, therefore I can't use the prefix in the way I mentioned; thus I would like it to be added automatically based on the severity of the logged message.

Any ideas how I can achieve this?
Thanks,
Pavlos

Custom fields

Is it possible to add custom fields? I would like to add some machine specific information (hostname for instance) for applications I have running in a cluster.

Uncaught exceptions

Does the python-json-logger support unhandled exceptions? I can see from the docs that catching an error and logging the error using exc_info works well, but what about in cases where the error isn't caught please?

Here is what I am seeing when the boto3 client throws an exception that isn't caught/handled, the traceback lines are seen as completely separate log messages:

{"asctime": "2019-08-29 08:02:29,276", "filename": "bash_operator.py", "lineno": 128, "levelname": "INFO", "message": "Traceback (most recent call last):"}
{"asctime": "2019-08-29 08:02:29,277", "filename": "bash_operator.py", "lineno": 128, "levelname": "INFO", "message": "  File \"/usr/local/lib/python3.7/runpy.py\", line 193, in _run_module_as_main"}
{"asctime": "2019-08-29 08:02:29,277", "filename": "bash_operator.py", "lineno": 128, "levelname": "INFO", "message": "    \"__main__\", mod_spec)"}
{"asctime": "2019-08-29 08:02:29,277", "filename": "bash_operator.py", "lineno": 128, "levelname": "INFO", "message": "  File \"/usr/local/lib/python3.7/runpy.py\", line 85, in _run_code"}
{"asctime": "2019-08-29 08:02:29,277", "filename": "bash_operator.py", "lineno": 128, "levelname": "INFO", "message": "    exec(code, run_globals)"}
{"asctime": "2019-08-29 08:02:29,277", "filename": "bash_operator.py", "lineno": 128, "levelname": "INFO", "message": "  File \"/home/airflow/airflow/scripts/cme/irs_to_s3.py\", line 34, in <module>"}
{"asctime": "2019-08-29 08:02:29,277", "filename": "bash_operator.py", "lineno": 128, "levelname": "INFO", "message": "    bucket_files = [el.key for el in s3_bucket.objects.all()]"}
{"asctime": "2019-08-29 08:02:29,278", "filename": "bash_operator.py", "lineno": 128, "levelname": "INFO", "message": "  File \"/home/airflow/airflow/scripts/cme/irs_to_s3.py\", line 34, in <listcomp>"}
{"asctime": "2019-08-29 08:02:29,278", "filename": "bash_operator.py", "lineno": 128, "levelname": "INFO", "message": "    bucket_files = [el.key for el in s3_bucket.objects.all()]"}
{"asctime": "2019-08-29 08:02:29,278", "filename": "bash_operator.py", "lineno": 128, "levelname": "INFO", "message": "  File \"/home/airflow/airflow/venv_lite_py3/lib/python3.7/site-packages/boto3/resources/collection.py\", line 83, in __iter__"}
{"asctime": "2019-08-29 08:02:29,278", "filename": "bash_operator.py", "lineno": 128, "levelname": "INFO", "message": "    for page in self.pages():"}
{"asctime": "2019-08-29 08:02:29,278", "filename": "bash_operator.py", "lineno": 128, "levelname": "INFO", "message": "  File \"/home/airflow/airflow/venv_lite_py3/lib/python3.7/site-packages/boto3/resources/collection.py\", line 166, in pages"}
{"asctime": "2019-08-29 08:02:29,278", "filename": "bash_operator.py", "lineno": 128, "levelname": "INFO", "message": "    for page in pages:"}
{"asctime": "2019-08-29 08:02:29,278", "filename": "bash_operator.py", "lineno": 128, "levelname": "INFO", "message": "  File \"/home/airflow/airflow/venv_lite_py3/lib/python3.7/site-packages/botocore/paginate.py\", line 255, in __iter__"}
{"asctime": "2019-08-29 08:02:29,278", "filename": "bash_operator.py", "lineno": 128, "levelname": "INFO", "message": "    response = self._make_request(current_kwargs)"}
{"asctime": "2019-08-29 08:02:29,279", "filename": "bash_operator.py", "lineno": 128, "levelname": "INFO", "message": "  File \"/home/airflow/airflow/venv_lite_py3/lib/python3.7/site-packages/botocore/paginate.py\", line 332, in _make_request"}
{"asctime": "2019-08-29 08:02:29,279", "filename": "bash_operator.py", "lineno": 128, "levelname": "INFO", "message": "    return self._method(**current_kwargs)"}
{"asctime": "2019-08-29 08:02:29,279", "filename": "bash_operator.py", "lineno": 128, "levelname": "INFO", "message": "  File \"/home/airflow/airflow/venv_lite_py3/lib/python3.7/site-packages/botocore/client.py\", line 357, in _api_call"}
{"asctime": "2019-08-29 08:02:29,279", "filename": "bash_operator.py", "lineno": 128, "levelname": "INFO", "message": "    return self._make_api_call(operation_name, kwargs)"}
{"asctime": "2019-08-29 08:02:29,279", "filename": "bash_operator.py", "lineno": 128, "levelname": "INFO", "message": "  File \"/home/airflow/airflow/venv_lite_py3/lib/python3.7/site-packages/botocore/client.py\", line 648, in _make_api_call"}
{"asctime": "2019-08-29 08:02:29,279", "filename": "bash_operator.py", "lineno": 128, "levelname": "INFO", "message": "    operation_model, request_dict, request_context)"}
{"asctime": "2019-08-29 08:02:29,279", "filename": "bash_operator.py", "lineno": 128, "levelname": "INFO", "message": "  File \"/home/airflow/airflow/venv_lite_py3/lib/python3.7/site-packages/botocore/client.py\", line 667, in _make_request"}
{"asctime": "2019-08-29 08:02:29,280", "filename": "bash_operator.py", "lineno": 128, "levelname": "INFO", "message": "    return self._endpoint.make_request(operation_model, request_dict)"}
{"asctime": "2019-08-29 08:02:29,280", "filename": "bash_operator.py", "lineno": 128, "levelname": "INFO", "message": "  File \"/home/airflow/airflow/venv_lite_py3/lib/python3.7/site-packages/botocore/endpoint.py\", line 102, in make_request"}
{"asctime": "2019-08-29 08:02:29,280", "filename": "bash_operator.py", "lineno": 128, "levelname": "INFO", "message": "    return self._send_request(request_dict, operation_model)"}
{"asctime": "2019-08-29 08:02:29,280", "filename": "bash_operator.py", "lineno": 128, "levelname": "INFO", "message": "  File \"/home/airflow/airflow/venv_lite_py3/lib/python3.7/site-packages/botocore/endpoint.py\", line 132, in _send_request"}
{"asctime": "2019-08-29 08:02:29,280", "filename": "bash_operator.py", "lineno": 128, "levelname": "INFO", "message": "    request = self.create_request(request_dict, operation_model)"}
{"asctime": "2019-08-29 08:02:29,280", "filename": "bash_operator.py", "lineno": 128, "levelname": "INFO", "message": "  File \"/home/airflow/airflow/venv_lite_py3/lib/python3.7/site-packages/botocore/endpoint.py\", line 116, in create_request"}
{"asctime": "2019-08-29 08:02:29,280", "filename": "bash_operator.py", "lineno": 128, "levelname": "INFO", "message": "    operation_name=operation_model.name)"}
{"asctime": "2019-08-29 08:02:29,280", "filename": "bash_operator.py", "lineno": 128, "levelname": "INFO", "message": "  File \"/home/airflow/airflow/venv_lite_py3/lib/python3.7/site-packages/botocore/hooks.py\", line 356, in emit"}
{"asctime": "2019-08-29 08:02:29,281", "filename": "bash_operator.py", "lineno": 128, "levelname": "INFO", "message": "    return self._emitter.emit(aliased_event_name, **kwargs)"}
{"asctime": "2019-08-29 08:02:29,281", "filename": "bash_operator.py", "lineno": 128, "levelname": "INFO", "message": "  File \"/home/airflow/airflow/venv_lite_py3/lib/python3.7/site-packages/botocore/hooks.py\", line 228, in emit"}
{"asctime": "2019-08-29 08:02:29,281", "filename": "bash_operator.py", "lineno": 128, "levelname": "INFO", "message": "    return self._emit(event_name, kwargs)"}
{"asctime": "2019-08-29 08:02:29,281", "filename": "bash_operator.py", "lineno": 128, "levelname": "INFO", "message": "  File \"/home/airflow/airflow/venv_lite_py3/lib/python3.7/site-packages/botocore/hooks.py\", line 211, in _emit"}
{"asctime": "2019-08-29 08:02:29,281", "filename": "bash_operator.py", "lineno": 128, "levelname": "INFO", "message": "    response = handler(**kwargs)"}
{"asctime": "2019-08-29 08:02:29,281", "filename": "bash_operator.py", "lineno": 128, "levelname": "INFO", "message": "  File \"/home/airflow/airflow/venv_lite_py3/lib/python3.7/site-packages/botocore/signers.py\", line 90, in handler"}
{"asctime": "2019-08-29 08:02:29,281", "filename": "bash_operator.py", "lineno": 128, "levelname": "INFO", "message": "    return self.sign(operation_name, request)"}
{"asctime": "2019-08-29 08:02:29,282", "filename": "bash_operator.py", "lineno": 128, "levelname": "INFO", "message": "  File \"/home/airflow/airflow/venv_lite_py3/lib/python3.7/site-packages/botocore/signers.py\", line 157, in sign"}
{"asctime": "2019-08-29 08:02:29,282", "filename": "bash_operator.py", "lineno": 128, "levelname": "INFO", "message": "    auth.add_auth(request)"}
{"asctime": "2019-08-29 08:02:29,282", "filename": "bash_operator.py", "lineno": 128, "levelname": "INFO", "message": "  File \"/home/airflow/airflow/venv_lite_py3/lib/python3.7/site-packages/botocore/auth.py\", line 425, in add_auth"}
{"asctime": "2019-08-29 08:02:29,285", "filename": "bash_operator.py", "lineno": 128, "levelname": "INFO", "message": "    super(S3SigV4Auth, self).add_auth(request)"}
{"asctime": "2019-08-29 08:02:29,285", "filename": "bash_operator.py", "lineno": 128, "levelname": "INFO", "message": "  File \"/home/airflow/airflow/venv_lite_py3/lib/python3.7/site-packages/botocore/auth.py\", line 357, in add_auth"}
{"asctime": "2019-08-29 08:02:29,285", "filename": "bash_operator.py", "lineno": 128, "levelname": "INFO", "message": "    raise NoCredentialsError"}
{"asctime": "2019-08-29 08:02:29,285", "filename": "bash_operator.py", "lineno": 128, "levelname": "INFO", "message": "botocore.exceptions.NoCredentialsError: Unable to locate credentials"}
{"asctime": "2019-08-29 08:02:29,374", "filename": "bash_operator.py", "lineno": 132, "levelname": "INFO", "message": "Command exited with return code 1"}

Happy to try and work through a PR, but I just wanted to clarify the traceback support first.
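The log output above shows each traceback line landing in its own JSON record, because the subprocess's stderr is being re-logged line by line. When the exception is logged directly with `logger.exception`, the whole traceback stays in one record. A minimal stand-in formatter (not the library's actual implementation, just an illustration of the mechanism) looks like this:

```python
import json
import logging


class OneRecordJsonFormatter(logging.Formatter):
    """Illustrative formatter: keeps a full traceback inside a single
    JSON record instead of one record per traceback line."""

    def format(self, record):
        entry = {
            "levelname": record.levelname,
            "message": record.getMessage(),
        }
        if record.exc_info:
            # formatException renders the whole traceback as one string
            entry["exc_info"] = self.formatException(record.exc_info)
        return json.dumps(entry)


logger = logging.getLogger("demo")
handler = logging.StreamHandler()
handler.setFormatter(OneRecordJsonFormatter())
logger.addHandler(handler)

try:
    1 / 0
except ZeroDivisionError:
    # logger.exception attaches exc_info to the record automatically
    logger.exception("division failed")
```

The same single-record behaviour applies to `JsonFormatter` when `exc_info` is present on the record; the per-line output above comes from the caller splitting the text before logging it.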

ORJSON

How can I use orjson instead of the standard json library?
Can this be done via the logging config?
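The general pattern is to make the serializer a pluggable callable. A minimal sketch (a stand-in class, not the library's API; the `orjson_style_serializer` name is made up for illustration):

```python
import json
import logging


def orjson_style_serializer(obj, **kwargs):
    """Stand-in for orjson: orjson.dumps returns bytes, so a real
    adapter would look like `lambda obj, **kw: orjson.dumps(obj).decode()`."""
    return json.dumps(obj, separators=(",", ":"))


class PluggableJsonFormatter(logging.Formatter):
    """Minimal sketch of a formatter with a swappable serializer."""

    def __init__(self, serializer=json.dumps):
        super().__init__()
        self.serializer = serializer

    def format(self, record):
        return self.serializer({
            "levelname": record.levelname,
            "message": record.getMessage(),
        })


formatter = PluggableJsonFormatter(serializer=orjson_style_serializer)
```

If your installed version of python-json-logger accepts a `json_serializer` argument on `JsonFormatter` (newer releases do; check your version), that is the logging-config-friendly route, with a small wrapper to decode orjson's `bytes` output to `str`.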

Add ability to customise contents of the JSON dictionary

For a project I had to rename the "message" field in the JSON dictionary to a different name. I have subclassed jsonlogger.JsonFormatter to do this, but there isn't really an elegant way to do it without copying the entire format method.

It would be great if there was a way to override a method that, after everything has been processed, would compose the JSON string based on the values it needs to do so. Is this a feature you would consider including? I'm happy to make a PR for it.
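The requested override point can be sketched as a post-processing hook that receives the assembled dictionary just before serialization. This is an illustration, not the library's code; the `"msg"` key is an arbitrary example:

```python
import json
import logging


class RenamingJsonFormatter(logging.Formatter):
    """Sketch of the requested hook: build the record dict, then pass it
    through an overridable method before composing the JSON string."""

    def format(self, record):
        log_record = {
            "levelname": record.levelname,
            "message": record.getMessage(),
        }
        return json.dumps(self.process_log_record(log_record))

    def process_log_record(self, log_record):
        # Override point: rename "message" to "msg" (hypothetical choice)
        log_record["msg"] = log_record.pop("message")
        return log_record
```

Later releases of python-json-logger expose a hook of this shape (`process_log_record`) on `JsonFormatter`; if your version has it, overriding that method avoids copying the whole `format` implementation.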

Python 3 support broken?

Line 34 of src/jsonlogger.py contains iteritems, which was removed in Python 3.

Does setup.py need to have use_2to3 = True uncommented?
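The simpler fix is to replace `iteritems()` with `items()`, which works on both Python 2 and 3 (on Python 2 it builds a list instead of an iterator, which is fine for small record dicts), rather than reviving 2to3:

```python
record_dict = {"levelname": "INFO", "message": "hi"}

# Python 2 only (removed in Python 3):
#     for key, value in record_dict.iteritems(): ...
# Portable replacement:
pairs = {key: value for key, value in record_dict.items()}
```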

Create Custom Handler(s)

Create one or more custom handlers to make JSON logging the shizzle:

It might be nice to have a straight up JSON handler that'll make sure the log "file" is an array of objects.

Also, a handler that can piggyback off the datagram or socket handlers to send proper JSON to an endpoint with RESTful behaviour would be a nice feature.
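The array-of-objects idea can be sketched with a small custom `logging.Handler`. This is a hypothetical class, not part of python-json-logger, and the rewrite-on-every-emit strategy is deliberately naive:

```python
import json
import logging


class JsonArrayFileHandler(logging.Handler):
    """Sketch of the proposed handler: keeps the log file a single valid
    JSON array of objects, rather than newline-delimited JSON."""

    def __init__(self, path):
        super().__init__()
        self.path = path
        self.records = []

    def emit(self, record):
        self.records.append({
            "levelname": record.levelname,
            "message": record.getMessage(),
        })
        # Rewrite the whole array on each emit -- simple but O(n) per
        # record; a production version would seek back over the closing
        # bracket and append instead.
        with open(self.path, "w") as fh:
            json.dump(self.records, fh)
```

The resulting file can be loaded with a single `json.load` call, at the cost of rewriting on every record; for the RESTful variant, `emit` would instead POST the serialized record to an endpoint.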
