fedspendingtransparency / data-act-broker-backend
Services that power the DATA Act's central data submission platform
License: Creative Commons Zero v1.0 Universal
It's a well-known fact that bcrypt silently truncates input after 72 bytes. Appending a "salt" to the user's password before passing it to bcrypt (which is itself a salted password-hashing algorithm, using a 128-bit salt derived from a CSPRNG) doesn't buy you any security.
To verify this: hash a password consisting of "A" repeated 72 times, then check any other string sharing that 72-byte prefix against the hash; both will verify. If you're looking to add a level of security beyond what bcrypt offers, you have a few ways you can go:
Hello, I followed the README, but once I issued the command
docker exec -it dataact-broker-backend python dataactcore/scripts/initialize.py -i
the following error appears:
2022-04-07 21:34:18,828 INFO:dataactvalidator.scripts.load_cfda_data:Fetching CFDA file from new-url.com/cfda.csv
Traceback (most recent call last):
File "dataactcore/scripts/initialize.py", line 249, in <module>
main()
File "dataactcore/scripts/initialize.py", line 180, in main
load_domain_value_files(validator_config_path, args.force)
File "dataactcore/scripts/initialize.py", line 93, in load_domain_value_files
load_cfda_program(base_path)
File "/data-act/backend/dataactvalidator/scripts/load_cfda_data.py", line 82, in load_cfda_program
r = requests.get(S3_CFDA_FILE, allow_redirects=True)
File "/usr/local/lib/python3.7/site-packages/requests/api.py", line 76, in get
return request('get', url, params=params, **kwargs)
File "/usr/local/lib/python3.7/site-packages/requests/api.py", line 61, in request
return session.request(method=method, url=url, **kwargs)
File "/usr/local/lib/python3.7/site-packages/requests/sessions.py", line 528, in request
prep = self.prepare_request(req)
File "/usr/local/lib/python3.7/site-packages/requests/sessions.py", line 466, in prepare_request
hooks=merge_hooks(request.hooks, self.hooks),
File "/usr/local/lib/python3.7/site-packages/requests/models.py", line 316, in prepare
self.prepare_url(url, params)
File "/usr/local/lib/python3.7/site-packages/requests/models.py", line 390, in prepare_url
raise MissingSchema(error)
requests.exceptions.MissingSchema: Invalid URL 'new-url.com/cfda.csv': No schema supplied. Perhaps you meant http://new-url.com/cfda.csv?
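The immediate failure comes from requests itself, not the broker: a URL with no scheme (`http://` or `https://`) raises MissingSchema before any network request is made. A minimal reproduction, independent of the broker code (the host is the same placeholder as in the traceback and is never actually contacted):

```python
# Reproduce the MissingSchema error from the traceback above.
import requests

try:
    # No scheme on the URL -> requests refuses it during request preparation.
    requests.get("new-url.com/cfda.csv")
except requests.exceptions.MissingSchema as exc:
    print(type(exc).__name__)  # MissingSchema

# Prefixing a scheme makes the URL parseable; whether the host actually
# serves the file is a separate question.
url = "https://new-url.com/cfda.csv"
```

So whatever value ends up in the config, it must be a full URL including the scheme, not a bare hostname.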
Reading the code, it seems I need to set these:
usas_public_reference_url: new-url.com
usas_public_submissions_url: new-url.com
but I don't know what values to use; USAspending doesn't seem to host these files. Since I don't have direct access to all the data, how can I update my DB with the new data without having to rebuild it from historical data every time?
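For reference, the two keys quoted above would presumably be set in the broker's YAML configuration. This is a hypothetical fragment only: the key names are taken from the question, and the values are placeholders, since the real URLs are exactly what is being asked for. Note the scheme is included, per the MissingSchema error:

```yaml
# Hypothetical config fragment; values are placeholders, not real endpoints.
usas_public_reference_url: https://example.com/reference
usas_public_submissions_url: https://example.com/submissions
```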
Some users on Windows may be affected by a known RabbitMQ issue where the program does not run but the log files remain empty. Pivotal's response to this issue: https://groups.google.com/forum/#!topic/rabbitmq-users/6nN-ORsl_gw
Files required for initializing the broker (e.g. cgac.csv, object_class.csv, program_activity.csv) should be placed in a data folder and included in the repo (or, at minimum, a link should be provided to where the required files can be downloaded).
Hello,
I am running into issues uploading and validating files in a local copy of the DATA Act broker. I would like to know the sequence of API calls I need to execute to submit and validate files. I have tried /v1/submit_files/ followed by /v1/finalize_job/, but I am not sure if this is right. I don't see any uploaded files, nor the submission error and warning files.
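The sequence being attempted can be sketched as a small client. This is only an illustration of the order described in the question, not a confirmed broker API: the payload fields, the job-ID keys in the response, and the base URL are all assumptions:

```python
# Hypothetical sketch of the submit-then-finalize sequence described above.
# Endpoint payloads and response keys are assumptions, not documented API.
import requests

BROKER_URL = "http://localhost:3333"  # placeholder for a local broker


def submit_and_finalize(session: requests.Session, files: dict) -> dict:
    """Upload submission files, then finalize each upload job.

    `files` maps form field names to open file handles, as expected by
    requests' multipart upload support.
    """
    # Step 1: create the submission and upload the files.
    resp = session.post(f"{BROKER_URL}/v1/submit_files/", files=files)
    resp.raise_for_status()
    submission = resp.json()

    # Step 2: finalize each upload job so validation can begin.
    # The *_id keys below are assumed names for job IDs in the response.
    for key in ("appropriations_id", "program_activity_id", "award_financial_id"):
        job_id = submission.get(key)
        if job_id is not None:
            finalize = session.post(
                f"{BROKER_URL}/v1/finalize_job/", json={"upload_id": job_id}
            )
            finalize.raise_for_status()
    return submission
```

If errors and warning files never appear, checking the broker logs for the validator jobs spawned after finalize would be the next step.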
Hello,
When using the local environment of the DATA Act broker, where can we get the most current D1, D2, SF-133, and GTAS data? Is there a configuration option that allows the broker to access it directly from Treasury? Without these files, validation in a local environment will always produce the relevant submission errors/warnings.
When I tried using the /v1/generate_file/ API to generate D1 data, it returned the file URL as '#'. The full request and response from the API are below.
Input:
{
"submission_id": 2,
"file_type": "D1",
"start": "01/10/2017",
"end": "12/31/2017"
}
Output:
{
"url": "#",
"status": "waiting",
"end": "12/31/2017",
"size": null,
"start": "01/10/2017",
"file_type": "D1",
"message": ""
}
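The `"status": "waiting"` in that response suggests the file simply hadn't been generated yet, with `"url": "#"` as a placeholder until the job finishes. One approach is to poll until the status changes. This sketch assumes a companion status endpoint (`/v1/check_generation_status/`) and a terminal `"finished"` status; both are guesses from the response shape above, not confirmed API:

```python
# Hypothetical polling loop for file generation; endpoint name, status
# values, and BROKER_URL are assumptions based on the response above.
import time

import requests

BROKER_URL = "http://localhost:3333"  # placeholder for a local broker


def wait_for_file(session, submission_id, file_type, timeout=300, interval=5):
    """Poll until the generated file has a real URL, or raise on timeout."""
    deadline = time.time() + timeout
    while time.time() < deadline:
        resp = session.post(
            f"{BROKER_URL}/v1/check_generation_status/",
            json={"submission_id": submission_id, "file_type": file_type},
        )
        resp.raise_for_status()
        body = resp.json()
        # '#' appears to be the placeholder URL while generation is pending.
        if body.get("status") == "finished" and body.get("url") not in (None, "#"):
            return body["url"]
        time.sleep(interval)
    raise TimeoutError(f"{file_type} generation did not finish in {timeout}s")
```

If the status never leaves "waiting" locally, the generation job may depend on source data (D1 from FPDS, in this case) that a local install doesn't have, which loops back to the question above.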
When launching the validator app, I receive the following message:
However, the application does not appear to have launched successfully, judging by network stats:
The validator app is configured to run on port 3335, and the broker application is running successfully on 3333. I also ran health_check.py and launched the test validator app successfully.
When I kill the dataactvalidator app, there is an empty job queue. Is there something missing from a configuration standpoint for a local installation?
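One quick way to confirm what the network stats show is a plain socket probe against the validator's port. This is a generic stdlib check, not broker-specific tooling; the port number comes from the description above:

```python
# Check whether anything is accepting TCP connections on a given port.
import socket


def port_open(host: str, port: int, timeout: float = 1.0) -> bool:
    """Return True if a TCP connection to (host, port) succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False


# 3335 is the validator port mentioned above; 3333 would be the broker.
print(port_open("127.0.0.1", 3335))
```

If this returns False while the broker's port returns True, the validator process exited (or never bound the port) despite its startup message, and its logs or the job-queue configuration are the place to look next.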