
rds-pgbadger's Introduction


RDS-pgBadger

Fetches RDS log files and analyzes them with pgBadger.

Prerequisites

Make sure your credentials are set in the ~/.aws/credentials file. You can also set a region in the ~/.aws/config file, so you do not need to pass the region option to the script. Last but not least, make sure pgbadger is installed and reachable from your $PATH.
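As a sanity check before running the tool, the prerequisites above can be verified programmatically. This is an illustrative sketch, not part of rds-pgbadger; `check_prerequisites` is a hypothetical helper:

```python
import os
import shutil

def check_prerequisites():
    """Return a list of human-readable problems with the local setup."""
    problems = []
    # pgbadger must be resolvable via $PATH
    if shutil.which("pgbadger") is None:
        problems.append("pgbadger not found on $PATH")
    # boto3 reads credentials from ~/.aws/credentials by default
    if not os.path.exists(os.path.expanduser("~/.aws/credentials")):
        problems.append("~/.aws/credentials not found")
    return problems
```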

Parameter group

You will have to configure your database parameter group.

First of all, ensure log_min_duration_statement is set to 0 or higher; otherwise there will be nothing to parse.

Then enable a few other parameters to get more information into the logs.

Parameter                      Value
---------------------------    -----
log_checkpoints                1
log_connections                1
log_disconnections             1
log_lock_waits                 1
log_temp_files                 0
log_autovacuum_min_duration    0

Also make sure lc_messages is either at the engine default or set to C.

For further details, please refer to Dalibo's pgBadger documentation.
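The recommended values above can be kept as a small checklist and diffed against what a parameter group currently holds. A minimal sketch, not part of the tool; `missing_settings` is a hypothetical helper and its argument stands in for values you would read from your parameter group:

```python
# Recommended parameter-group values from the table above.
RECOMMENDED = {
    "log_checkpoints": "1",
    "log_connections": "1",
    "log_disconnections": "1",
    "log_lock_waits": "1",
    "log_temp_files": "0",
    "log_autovacuum_min_duration": "0",
}

def missing_settings(current):
    """Return the parameters whose current value differs from the table."""
    return {name: value for name, value in RECOMMENDED.items()
            if current.get(name) != value}
```

For example, `missing_settings({"log_checkpoints": "1"})` would report the five remaining parameters as still needing to be set.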

Installation

You can install it using pip:

$ pip install rdspgbadger

Usage

To build a pgBadger report, just run the following (replacing instanceid with your instance ID):

$ rds-pgbadger instanceid

Options

Only the instance ID is mandatory, but there are other options you can use:

  • -d, --date : by default the script downloads all the available logs. By specifying a date in the format YYYY-MM-DD, you can download only that day's logs.
  • -r, --region : by default the script uses the region specified in your AWS config file. If none is set, or if you wish to override it, you can use this option to do so.
  • -o, --output : by default the script outputs log files and reports to the out folder. This option allows you to change it.
  • -n, --no-process : download log file(s), but do not process them with pgBadger.
  • -X, --pgbadger-args : command-line arguments to pass to pgBadger.
  • --assume-role : by specifying a role you can use STS to assume it, which is useful for cross-account access without having to set up the .config file. Format: arn:aws:iam::<account_id>:<role_name>
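Given the --assume-role format documented above, the account ID and role name can be recovered by splitting the string. This is a hypothetical helper for illustration only, not the tool's actual code:

```python
def split_role_arn(arn):
    """Split an ARN of the form arn:aws:iam::<account_id>:<role_name>
    (the --assume-role format above) into (account_id, role_name)."""
    prefix = "arn:aws:iam::"
    if not arn.startswith(prefix):
        raise ValueError("not an IAM ARN: %r" % arn)
    account_id, _, role_name = arn[len(prefix):].partition(":")
    return account_id, role_name
```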

Known issue

Despite the great work of askainet, the AWS API seems to be unstable, and downloads of big log files can sometimes fail. In such cases, retrying a few minutes later usually works.

See pull request 10.
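Until the API stabilizes, one workaround is wrapping the failing call in an exponential-backoff retry. This is only a sketch of the general pattern, not the tool's actual retry logic (boto3 also ships its own built-in retries):

```python
import time

def with_retries(fn, attempts=4, base_delay=1.0):
    """Call fn(), retrying with exponential backoff on any exception.

    Sleeps base_delay, then 2x, 4x, ... between attempts; re-raises
    the last exception once all attempts are exhausted.
    """
    for attempt in range(attempts):
        try:
            return fn()
        except Exception:
            if attempt == attempts - 1:
                raise
            time.sleep(base_delay * 2 ** attempt)
```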

Contribute

For any request, feel free to open a pull request or file an issue on GitHub.

rds-pgbadger's People

Contributors

askainet, chouze, dependabot-support, fpietka, goetzb, hrishabhg, lgtml, mpettypiece, stickler-ci


rds-pgbadger's Issues

Error while downloading huge log files

2019-07-24 07:35:05,150 :: INFO :: Getting logs from 2019-07-23
2019-07-24 07:35:05,352 :: INFO :: Downloading file /home/postgres/logs/vendor-live//error/postgresql.log.2019-07-23-00
2019-07-24 07:35:05,996 :: INFO :: Log truncated, retrying portion with NumberOfLines = 948
2019-07-24 07:35:08,924 :: INFO :: Log truncated, retrying portion with NumberOfLines = 852
2019-07-24 07:36:25,812 :: ERROR :: An error occurred (Throttling) when calling the DownloadDBLogFilePortion operation (reached max retries: 4): Rate exceeded

Pass extra command-line arguments to pgbadger

I'd like to be able to pass extra command-line arguments to pgbadger.

An example would be something like:

rds-pgbadger my_instance -r us-east-1 --pgbadger-args '--exclude-query "^(VACUUM|COMMIT)" --top 30'

Thoughts? I'm more than happy to create a pull request if this is something you'd be interested in merging.

1 MB RDS log limit

Hi,

Just wanted to ask: how did you get around the 1 MB limit on log file downloads?

Thanks
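For context, the RDS API's DownloadDBLogFilePortion call returns at most roughly 1 MB per request, so a full file has to be fetched by paging with the returned Marker until AdditionalDataPending is false. A rough sketch of that loop, assuming a boto3 RDS client (the test below substitutes a fake client so nothing hits AWS):

```python
def download_full_log(client, instance_id, filename):
    """Download a complete RDS log file by paging through
    DownloadDBLogFilePortion.

    `client` is a boto3 RDS client, or anything exposing the same
    download_db_log_file_portion(...) method and response shape.
    """
    marker = "0"   # "0" means start from the beginning of the file
    chunks = []
    while True:
        resp = client.download_db_log_file_portion(
            DBInstanceIdentifier=instance_id,
            LogFileName=filename,
            Marker=marker,
        )
        chunks.append(resp.get("LogFileData") or "")
        # AdditionalDataPending signals there is more of the file to fetch
        if not resp.get("AdditionalDataPending"):
            break
        marker = resp["Marker"]
    return "".join(chunks)
```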

pgBadger required even with -n

Running the program with -n should not require pgBadger, but the program fails with an error message saying pgBadger was not found.

TypeError: 'in <string>' requires string as left operand, not NoneType

I am running the command "rds-pgbadger my-db-instance" but getting the following exception:

2016-11-04 08:29:36,529 :: INFO :: Getting all logs
Traceback (most recent call last):
  File "/usr/local/bin/rds-pgbadger", line 9, in <module>
    load_entry_point('rdspgbadger==1.0.2', 'console_scripts', 'rds-pgbadger')()
  File "/Library/Python/2.7/site-packages/package/rdspgbadger.py", line 117, in main
    get_all_logs(args.instance, args.output, args.date, args.region)
  File "/Library/Python/2.7/site-packages/package/rdspgbadger.py", line 84, in get_all_logs
    for log in (name for name in response.get("DescribeDBLogFiles")
  File "/Library/Python/2.7/site-packages/package/rdspgbadger.py", line 85, in <genexpr>
    if date in name["LogFileName"]):
TypeError: 'in <string>' requires string as left operand, not NoneType

It would seem that the error is due to a missing region, but my ~/.aws/config looks like this:

[default]
region = us-east-1
output = json

Python version:

Python 2.7.10 (default, Jul 30 2016, 18:31:42)
[GCC 4.2.1 Compatible Apple LLVM 8.0.0 (clang-800.0.34)] on darwin

Error with large log files

Hi,

I am getting this error with my logs:

2018-11-30 15:44:55,663 :: INFO :: Downloading file postgres_logs/error/postgresql.log.2018-11-30-00
2018-11-30 15:44:55,735 :: INFO :: Downloading file postgres_logs/error/postgresql.log.2018-11-30-01
2018-11-30 15:44:55,825 :: INFO :: Downloading file postgres_logs/error/postgresql.log.2018-11-30-02
2018-11-30 15:44:55,967 :: INFO :: Log truncated, retrying portion with NumberOfLines = 76
2018-11-30 15:44:56,146 :: INFO :: Log truncated, retrying portion with NumberOfLines = 1                                                                                                              
2018-11-30 14:51:08,971 :: ERROR :: An error occurred (Throttling) when calling the DownloadDBLogFilePortion operation (reached max retries: 4): Rate exceeded

I generate up to 35 GB of logs daily, each chunk around 2-3 GB.

Looping on log truncated

When trying to pull down the logs I get stuck in a retry loop

2019-04-08 10:10:03,806 :: INFO :: Downloading file out/error/postgresql.log.2019-04-05-18
2019-04-08 10:10:05,883 :: INFO :: Log truncated, retrying portion with NumberOfLines = 2864
2019-04-08 10:10:08,441 :: INFO :: Log truncated, retrying portion with NumberOfLines = 2838
2019-04-08 10:10:10,727 :: INFO :: Log truncated, retrying portion with NumberOfLines = 2633
2019-04-08 10:10:12,855 :: INFO :: Log truncated, retrying portion with NumberOfLines = 2528
2019-04-08 10:10:14,608 :: INFO :: Log truncated, retrying portion with NumberOfLines = 1528
2019-04-08 10:10:42,787 :: INFO :: Log truncated, retrying portion with NumberOfLines = 776
2019-04-08 10:10:49,459 :: INFO :: Log truncated, retrying portion with NumberOfLines = 511
2019-04-08 10:10:51,049 :: INFO :: Log truncated, retrying portion with NumberOfLines = 1
2019-04-08 10:10:52,547 :: INFO :: Log truncated, retrying portion with NumberOfLines = 1
2019-04-08 10:10:53,443 :: INFO :: Log truncated, retrying portion with NumberOfLines = 1
2019-04-08 10:10:54,233 :: INFO :: Log truncated, retrying portion with NumberOfLines = 1
2019-04-08 10:10:55,068 :: INFO :: Log truncated, retrying portion with NumberOfLines = 1
2019-04-08 10:10:56,016 :: INFO :: Log truncated, retrying portion with NumberOfLines = 1
2019-04-08 10:10:57,281 :: INFO :: Log truncated, retrying portion with NumberOfLines = 1
.... <snip>
2019-04-08 10:34:36,407 :: INFO :: Log truncated, retrying portion with NumberOfLines = 1
2019-04-08 10:34:37,384 :: INFO :: Log truncated, retrying portion with NumberOfLines = 1

and so on until I break out, meaning it never runs pgbadger.
Is there another setting I should have on my logging side or something to do on the download side?

Thanks
