Comments (4)
I just discovered that the built-in File object sets the file_size type to "int", which can only represent values up to 2 GB, so it is impossible to set the size of any file larger than that. That changes the above problem from mere "defensive coding" into a hard requirement, since the file_size field cannot be used at all for large (>2 GB) files.
I did find that I can manually change the file_size field from int to float, which gives a greater range, but that requires action on the user's part.
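For concreteness, a quick sketch in plain Python of the arithmetic behind the limit (the 3 GB file is hypothetical, just for illustration):

```python
# MariaDB's INT(11) is a signed 32-bit integer; the (11) is only a display
# width, so the largest storable value is 2**31 - 1, i.e. roughly 2 GiB.
INT32_MAX = 2**31 - 1             # 2,147,483,647

three_gb_file = 3 * 1024**3       # hypothetical 3 GB file size in bytes
print(three_gb_file > INT32_MAX)  # True: the value cannot fit in the column
```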
from dfp_external_storage.
An issue I found with changing the file_size field to a float is that the Content-Length header ends up being set to a decimal value, e.g. "100304505.0". This is an invalid value, so the browser does not know the full size of the file and cannot show a proper progress bar. It can be fixed by casting the file_size value to an int:
response_values["headers"].append(("Content-Length", int(doc.file_size)))
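The symptom can be reproduced without the framework at all: stringifying a float size yields a header value with a trailing ".0", which is not a valid Content-Length (the value must be digits only), while the int cast yields a valid one:

```python
# A float file_size stringifies with a trailing ".0", which is not a valid
# Content-Length value (HTTP requires the value to consist of digits only).
file_size = 100304505.0

bad = ("Content-Length", str(file_size))        # invalid: "100304505.0"
good = ("Content-Length", str(int(file_size)))  # valid:   "100304505"
print(bad[1], good[1])
```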
Hi @khoran thanks again for your feedback and your work on this (and your pull request) :)
I just made a commit in develop solving these issues; could you test it on your side please?: dab1b34
Comments/answers by concept:
Another small issue I encountered was in the file function. The check if not response_values['response'] fails with a KeyError if the key 'response' is not in the response_values dictionary at all. So an extra guard of 'response' not in response_values or ... would be good there.
Added the extra verification as you suggested: if "response" not in response_values or not response_values["response"]:
. I added a log error too, so we know on our side what caused that response not to be set:
frappe.log_error(f"Error obtaining remote file content: {name}/{file}")
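A minimal sketch of why the combined guard is needed (a plain dict stands in for the real response_values structure):

```python
# The dictionary may lack the "response" key entirely, so indexing it
# directly raises KeyError instead of taking the error branch.
response_values = {"headers": []}        # "response" was never set upstream

# if not response_values["response"]:    # would raise KeyError here
#     ...

# The combined guard short-circuits before indexing:
if "response" not in response_values or not response_values["response"]:
    print("remote file content missing")
```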
I just discovered that the built-in File object sets the file_size type to "int", which can only represent values up to 2 GB, so it is impossible to set the size of any file larger than that. That changes the above problem from mere "defensive coding" into a hard requirement, since the file_size field cannot be used at all for large (>2 GB) files.
I did find that I can manually change the file_size field from int to float, which gives a greater range, but that requires action on the user's part.
You are totally right: the INT(11) for file_size defined in the File DocType limits the maximum file size to 2,048 MB. It would need a BIGINT (maybe I am missing something that @gavindsouza (sorry for mentioning you! :) or anyone else from Frappe could answer for us :). By the way, I just added a new setting for the S3 Connection, remote_size_enabled, which enables the stat_object call you implemented for files > 2 GB: if checked, we always use dfp_file_size, which calls stat_object. More info in the recent commit: dab1b34
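A hedged sketch of the selection logic described above; the function and parameter names are illustrative, not the actual dfp_external_storage API:

```python
def effective_file_size(remote_size_enabled: bool,
                        stat_object_size: int,
                        doc_file_size: int) -> int:
    """Pick the size to report for a stored object.

    When the remote_size_enabled setting is on, the size comes from an S3
    stat_object call and is not capped by the 32-bit file_size column;
    otherwise the DocType field is used, which is fine below 2 GB.
    """
    if remote_size_enabled:
        return stat_object_size
    return doc_file_size

print(effective_file_size(True, 3 * 1024**3, 0))   # 3221225472
```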
@khoran, I am not "sure" about changing the Int field to Float as a solution. It helped as a temporary one, but I did not add the int casting to the size :).
And nothing else, thank you very much for your time Kevin, have a good weekend!
Hi again @khoran, I just forgot to tell you that in your case you must check the new setting within the bucket setup :)