codingjoe / django-s3file

A lightweight file upload input for Django and Amazon S3

License: MIT License

Python 86.62% JavaScript 9.98% HTML 3.39%
django aws s3 django-storages aws-s3 amazon files storage heroku heroku-deployment

django-s3file's Introduction

django-s3file

A lightweight file upload input for Django and Amazon S3.

Django-S3File allows you to upload files directly to AWS S3, effectively bypassing your application server. This allows you to avoid long-running requests from large file uploads. It is particularly helpful if you run your service on AWS Lambda or Heroku, where you have a hard request time limit.


Features

  • lightweight: less than 200 lines
  • no JavaScript or Python dependencies (no jQuery)
  • easy integration
  • works just like the built-in ClearableFileInput widget
  • extendable JavaScript API

For the Nerds

sequenceDiagram
    autonumber
    actor Browser
    participant S3
    participant Middleware
    Browser->>Django: GET form view
    activate Django
    Django->>Browser: RESPONSE w/ presigned POST URL & signed middleware key
    deactivate Django
    Browser->>S3: POST large file
    activate S3
    S3->>Browser: RESPONSE AWS S3 key
    Browser->>Middleware: POST AWS S3 key (signed)
    activate Middleware
    Middleware->>S3: GET AWS S3 key
    S3->>Middleware: RESPONSE large file promise
    deactivate S3
    Middleware->>Django: request incl. large file promise
    deactivate Middleware
    activate Django
    opt only if file is processed by Django
        Django-->>S3: GET large file
        activate S3
        S3-->>Django: RESPONSE large file
        deactivate S3
    end
    Django->>Browser: RESPONSE success
    deactivate Django

In a nutshell, we can bypass Django completely and have AWS handle the upload or any processing. Of course, if you want to do something with your file in Django, you can do so, just like before, with the added advantage that your file is served from within your data center.

Installation

Make sure you have Amazon S3 storage set up correctly.

Just install S3File using pip:

pip install django-s3file
# or
pipenv install django-s3file

Add the S3File app and middleware in your settings:

# settings.py

INSTALLED_APPS = (
    '...',
    's3file',
    '...',
)

MIDDLEWARE = (
    '...',
    's3file.middleware.S3FileMiddleware',
    '...',
)

Usage

S3File automatically replaces Django's ClearableFileInput widget; you do not need to alter your code at all.

The ClearableFileInput widget is only automatically replaced when the DEFAULT_FILE_STORAGE setting is set to django-storages' S3Boto3Storage or the dummy FileSystemStorage is enabled.
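For instance, a minimal configuration that triggers the replacement might look like the following sketch; the bucket name is a placeholder, not a project default:

```python
# settings.py -- illustrative sketch, not a complete configuration.
# With django-storages' S3Boto3Storage as the default file storage,
# S3File swaps in its own widget for ClearableFileInput automatically.
DEFAULT_FILE_STORAGE = "storages.backends.s3boto3.S3Boto3Storage"
AWS_STORAGE_BUCKET_NAME = "my-bucket"  # placeholder
```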

Setting up the AWS S3 bucket

Upload folder

S3File uploads to a single folder. Files are later moved by Django when they are saved to the upload_to location.

It is recommended to set up expiration for that folder, to ensure that old and unused file uploads don't accumulate and incur costs.

The default folder name is tmp/s3file. You can change it via the S3FILE_UPLOAD_PATH setting.
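As a sketch, the recommended expiration can be configured with an S3 lifecycle rule scoped to the upload prefix. The rule ID and the seven-day window below are arbitrary example choices, and the prefix must match your S3FILE_UPLOAD_PATH:

```json
{
  "Rules": [
    {
      "ID": "expire-s3file-tmp-uploads",
      "Filter": { "Prefix": "tmp/s3file/" },
      "Status": "Enabled",
      "Expiration": { "Days": 7 }
    }
  ]
}
```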

CORS policy

You will need to allow POST requests from all origins. Just add the following to your CORS policy:

[
  {
    "AllowedHeaders": [
        "*"
    ],
    "AllowedMethods": [
        "POST"
    ],
    "AllowedOrigins": [
        "*"
    ],
    "ExposeHeaders": [],
    "MaxAgeSeconds": 3000
  }
]

Progress Bar

S3File emits progress signals that can be used to display some kind of progress bar. Signals named progress are emitted both for each individual file input and for the form as a whole.

The progress signal carries the following details:

console.log(event.detail)

{
    progress: 0.4725307607171312  // total upload progress of either a form or single input
    loaded: 1048576  // bytes uploaded so far of either a form or single input
    total: 2219064  // total bytes to upload
    currentFile: File {}  // file object
    currentFileName: "text.txt"  // file name of the file currently uploaded
    currentFileProgress: 0.47227834703299176  // upload progress of that file
    originalEvent: ProgressEvent {} // the original XHR onprogress event
}

The following example implements a Bootstrap progress bar for the upload progress of an entire form.

<div class="progress">
  <div class="progress-bar" role="progressbar" style="width: 0%;" aria-valuenow="0" aria-valuemin="0" aria-valuemax="100">0%</div>
</div>
(function () {
    var form = document.getElementsByTagName('form')[0]
    var progressBar = document.getElementsByClassName('progress-bar')[0]

    form.addEventListener('progress', function (event) {
        // event.detail.progress is a value between 0 and 1
        var percent = Math.round(event.detail.progress * 100)

        progressBar.setAttribute('style', 'width:' + percent + '%')
        progressBar.setAttribute('aria-valuenow', percent)
        progressBar.innerText = percent + '%'
    })
})()

Using S3File in development

Using S3File in development can be helpful, especially if you want to use the progress signals described above. Therefore, S3File ships with an AWS S3 dummy backend. It behaves similarly to the real S3 storage backend. It is automatically enabled if the DEFAULT_FILE_STORAGE setting is set to FileSystemStorage.
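A development configuration enabling the dummy backend could then be as small as this sketch:

```python
# settings.py (development) -- illustrative sketch. With
# FileSystemStorage as the default storage, S3File enables its local
# S3 dummy backend, so uploads and progress signals work offline.
DEFAULT_FILE_STORAGE = "django.core.files.storage.FileSystemStorage"
```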

To prevent users from accidentally using the FileSystemStorage and the insecure S3 dummy backend in production, there is an additional deployment check that will raise an error when you run Django's deployment check suite:

python manage.py check --deploy

We recommend always running the deployment check suite as part of your deployment pipeline.

Uploading multiple files

Django has limited support for uploading multiple files. S3File fully supports this feature. The custom middleware ensures that files are accessible via request.FILES, even though they have been uploaded to AWS S3 directly rather than to your Django application server.

Using optimized S3Boto3Storage

Since S3Boto3Storage supports storing data from any other file object, it uses a generalized _save function. As a consequence, the frontend uploads the file to S3, and the backend then copies it byte by byte to perform the move operation, just to rename the uploaded object. For large files this causes additional loading time for the user.

That's why S3File provides an optimized version of this method in storages_optimized.S3OptimizedUploadStorage. It uses S3's more efficient copy method, given that we know we only ever copy from one S3 location to another.

from s3file.storages_optimized import S3OptimizedUploadStorage

class MyStorage(S3OptimizedUploadStorage):  # Subclass and use like any other storage
    default_acl = 'private'

django-s3file's People

Contributors

aaronenberg, agsimmons, amureki, annefly, codingjoe, dependabot[bot], drakon, felipeespitalher, gitter-badger, herrbenesch, pyup-bot, requires, symroe, tilboerner


django-s3file's Issues

Incorrect s3 upload path separator on windows

When Django is hosted on a Windows machine (such as a local dev environment), the file is uploaded using backslashes as the path separator. So, instead of uploading to the tmp/s3file folder, the file gets a single long filename with backslashes in it.

This then causes a File Does Not Exist IOError in boto3 when it tries to pull the file back.

tmp folder not appearing

Hi, I've been desperately searching for something like this and would love to make use of it for a small project I'm spending way too much time on. I followed the insanely simple setup instructions :) but can't get the tmp folder to appear in my s3 bucket, so I'm assuming I could not get it to work. I want this to work so bad, please help!

Here's my AWS settings; everything else is just like in the setup instructions, forms use the clearable file input, etc. I can share the repo with you if that'd help. It's just a simple app to upload photos and videos that uses a calendar to track them, which I'm going to give to my family and friends. Problem is it takes forever to upload multiple images at once (one of the main reasons I'm so pumped about using this) and videos too, and I'm afraid they won't use it due to this. Also, it's on Heroku, so the 30-second timeout is a problem! I'll stop rambling...

DEFAULT_FILE_STORAGE = 'storages.backends.s3boto3.S3Boto3Storage'
AWS_ACCESS_KEY_ID = 'my key'
AWS_SECRET_ACCESS_KEY = 'secret access key'
AWS_STORAGE_BUCKET_NAME = 'bucket name'
AWS_S3_FILE_OVERWRITE = False
STATICFILES_STORAGE = 'storages.backends.s3boto3.S3Boto3Storage'
AWS_DEFAULT_ACL = None
AWS_S3_CUSTOM_DOMAIN = '%s.s3.amazonaws.com' % AWS_STORAGE_BUCKET_NAME
AWS_LOCATION = 'static'
AWS_S3_OBJECT_PARAMETERS = {
    'CacheControl': 'max-age=86400',
}
AWS_STORAGE_BUCKET_NAME_MEDIA = 'media bucket name'
AWS_S3_CUSTOM_DOMAIN_MEDIA = '%s.s3.amazonaws.com' % AWS_STORAGE_BUCKET_NAME_MEDIA
AWS_LOCATION_MEDIA = 'assets'

STATIC_ROOT = os.path.join(BASE_DIR, 'staticfiles')
STATIC_URL = "https://%s/%s/" % (AWS_S3_CUSTOM_DOMAIN, AWS_LOCATION)
STATICFILES_DIRS = [
    os.path.join(BASE_DIR, 'static')
]

MEDIA_ROOT = os.path.join(BASE_DIR, '/static/media')
MEDIA_URL = "https://%s/%s/" % (AWS_S3_CUSTOM_DOMAIN_MEDIA, AWS_LOCATION_MEDIA)

Only upload to S3 and bypass uploading to Django?

The current flow of the app is to first upload to S3 and then upload to Django, and this takes more than 30 seconds, which breaks my web app because the response exceeds Heroku's 30-second response time limit.

I'm not sure if this can be implemented, but if my understanding is correct, there will be some heavy changes needed to the code to expect the S3 uploaded file URL instead of the file object; please correct me if I'm wrong.

Any suggestions or pointers are really appreciated.

Adding exception to certain files?

Hello Joe,

I have been testing this in my staging environment and it is working very nicely for most files. However, I have some ImageFields which also use another package (django-image-cropping), and it is not playing nicely with django-s3file. The file is uploaded to the tmp folder but not moved across to the proper folder, and therefore Django is unable to link to the image.

I was wondering whether it's possible to exempt certain models; could we add something like a decorator?

Thanks.

Template to Display Uploaded Images

Hi Joe,

As part of the testapp, any chance you could include a simple view & template that would display the images uploaded using s3file? Maybe where currently the upload view redirects to show the json file names of what was just uploaded - instead it could render them onto a page?

I'm sure someone with your skills could whip it up in no time (I'm told flattery can get you everywhere :) ). Kidding, but it'd really be appreciated by noobs such as myself.

Thanks

Uploading with user id as the destination folder

Hey there,

This plugin is awesome—it's sped up my uploads by roughly 30-40% in comparison to uploading through my application server.

However I'm new to AWS development and Django as well and could use your help. My team has an application that I just took over where users login through a user pool that's managed by Cognito.

When a user logs in, we get their username, store it in the session (e.g. request.session['usr'] = usr), and use that to upload their files into their private folder in our S3 bucket. (e.g. s3bucket/user_id/fileupload_folder).

So I guess I have two questions: is there any way to append this user ID to the destination path when uploading via your plugin?

I assume that's not an option without forking the project and making my own custom implementation. As an alternative, is there a way to get the file's destination, so that I can move it into the directory of my choice in my view after it's been uploaded? I noticed there's a random directory that's made when uploading (e.g. s3/temp/_XRhseyETnW4NPAGuLDE5g/) so I'm not sure how to get the new file's destination. I'm not sure how efficient that solution is, and if I'd just be better off uploading through the app server to begin with.
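For what it's worth, destination paths like this are usually expressed in plain Django with a callable upload_to, which also determines where S3File's middleware hand-off ends up. A minimal sketch, where the owner_id attribute and the path layout are hypothetical examples, not part of django-s3file:

```python
# Hypothetical sketch: route each upload into a per-user folder via a
# callable ``upload_to``. ``owner_id`` is an assumed model attribute;
# adapt it to however your model stores the Cognito username.
def user_upload_to(instance, filename):
    """Return a destination key such as ``uploads/user_42/photo.png``."""
    return f"uploads/user_{instance.owner_id}/{filename}"
```

Attached as models.FileField(upload_to=user_upload_to), Django moves the temporary S3 object to this computed path when the model is saved.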

Uploading file takes more than expected

Hi there

First, thanks for this awesome plugin. Works like a charm and is (almost) exactly what I need.

So maybe I'm misunderstanding something or need to configure something, but I have the following scenario:

  1. I have a simple upload form where the user can upload a big file (let's say up to a couple of hundred MBs)
  2. The upload is via django-s3file, and the progress loader works like a charm
  3. Then the POST method is called at the end of the JS direct upload to S3
  4. My code receives the temporary file and, on save, transforms the key that is then stored in the DB

The part of my code that handles the upload looks like this:

form = UploadForm(request.POST, request.FILES)
if form.is_valid():
    upload = request.FILES['upload']
    obj = Obj()
    obj.save()

    obj.upload = upload  # upload_to needs the object id, so it's done in a second save call
    obj.save()

So this works. But the thing is that the last step (4) apparently takes more time at obj.upload = upload the bigger the file is (a couple of hundred MBs already took something like 20-30s). So when users click "upload" they see the upload progress, but then still have to wait quite some time until the view actually loads. I assume this is because the file is "moved" on S3. But it is probably not a move but a copy, which would explain why bigger files take longer.

So, somehow, this can't be the expected behaviour: the file is uploaded directly to S3, but the advantage is only partial, as the user now has to wait for the upload almost twice (the upload plus the time it takes to copy the file).

Is this intended? Are there any good workarounds, or can I configure something in django-s3file or django-storages so that saving doesn't take that much time?

Thanks!

400 (Bad Request)

I set everything up according to the instructions, but I always get an error in the console.
s3file.js:43 POST https://dev32849.s3.amazonaws.com/ 400 (Bad Request)

This field is required error

With this simple setup:

direct_s3_storage = S3OptimizedUploadStorage()

class FileUpload(models.Model):
    file = models.FileField(storage=direct_s3_storage)
    
class CsvUploadForm(ModelForm):
    class Meta:
        model = FileUpload
        fields = ["file"]

class FileUploadView(FormView):
    form_class = CsvUploadForm
    template_name = "upload.html"
<form method="post" enctype="multipart/form-data">    
    {% csrf_token %}
    {{ form }}
</form>
{{ form.media.js }}

I keep getting File: This field is required.

When I add {{ form.media.js }}, request.FILES is empty and "file" is in request.POST; when I remove {{ form.media.js }}, request.FILES has "file" and "file" is not in request.POST.

I have the app and the middleware.

I've tried to figure it out myself but couldn't. I feel like I'm missing something, but I'm not entirely sure what. Should I initialize my form differently?

Ability to change presigned URL conditions

I am using some custom JS to upload images, rather than the bundled s3file JS.

This is because I am using an in browser image editor to edit images before upload. The image editor can't (due to good browser / HTML restrictions) modify a file field, so I have to submit the data as base64 encoded.

The way presigned POST URLs work is that you have to specify the content encoding when creating the presigned URL, and then match it when POSTing later.

There isn't any configuration option in s3file at the moment for this, and the chances are that it's useful on a per field basis.

I've implemented this like so:

class Base64EncodedFileWidget(forms.ClearableFileInput):
    def get_conditions(self, accept):
        conditions = super().get_conditions(accept)
        conditions.append({"Content-Encoding": "base64"})
        return conditions

class myForm(forms.Form):
    image = forms.ImageField(widget=Base64EncodedFileWidget)

This works well, but it feels like it might be useful to work on a better solution for exposing these options somehow, or at least documenting something like this solution, if it's a good one.

I'm happy to work on this as a PR, but would like some direction on how best to implement it.

add progress in normal admin

Hi,

What would be the easiest way to add the progress bar to the normal admin?
I tried overriding admin/widgets/clearable_file_input.html, but somehow Django refuses to load my version of that file.

Files with identical base names cause storage file collision

Uncovered an interesting behaviour (bug?) when uploading multiple files that have the same filename. It behaves as follows:

  • From a file upload form with multiple set to True, select 2 or more files with the exact same file name (this is possible on macOS at least, if the files are in different folders but the file chooser modal is set to a parent folder of these folders; see screenshots for an example).
  • Only one of the files is actually uploaded to the tmp folder.
  • If you select n different files, then n copies of the single uploaded file are copied from the tmp folder to their final destination, so instead of n different files with the same name, you end up with n copies of just one of the files.

I haven't dug in deeper yet.

(Two screenshots attached.)

Is Python 3.9 really required?

Hiya,

5.5.0 made Python 3.9 or greater a requirement. I understand keeping up with latest versions, but I don't think this codebase uses code that requires features in 3.9.

Python 3.8 is still commonly used and supported, and Django 3.2 (the oldest Django supported by this project) is still supporting back to Python 3.6.

Could you add 3.8 back into CI and the requirements? Or is there a good reason not to?

I'm happy to make the PR if it would help.

Initial Update

Hi 👊

This is my first visit to this fine repo, but it seems you have been working hard to keep all dependencies updated so far.

Once you have closed this issue, I'll create separate pull requests for every update as soon as I find one.

That's it for now!

Happy merging! 🤖

temporary file on django server

Does this package create a temporary file on the Django server before uploading the file to S3?
Or does the upload occur directly on Amazon's servers?

Upload Progress Output

Hi, thank you for creating this middleware, it works great :)

I'm currently using this to upload large files to S3 via the UI (upwards of a couple of GB). I see there was some sort of support for a progress bar in a previous version of this plugin that was removed in #60.
Is there any other way in the current version to get upload progress? Right now I'm just exposing a div with a loading gif indicator (not ideal) when the form submission occurs and the upload is progressing.

FWIW I'm not super familiar with JS and event listeners etc.. I've looked through the JS that is being executed behind the scenes but I only see one possible variable that I could maybe hook into to get some progress, window.uploading (I may be overlooking something else)? And I'm not even sure if that will help me exactly.

If you have any pointers or suggestions on how to show the client the progress of the upload - I would greatly appreciate it.

Thanks,
Phil

Large file performance

Hello, can you please tell me whether slow performance with large files is expected? I'm trying to upload 1+ GB files and it takes approx. 20 minutes.

Large file greater than 5GB

Hi,
Thanks for awesome project.

Basically, I can upload file to s3 properly. :)
But I am facing a bad request error when I upload large files (>5GB).
I checked developer mode in the browser; there was only one PUT request to S3.
It seems django-storages supports multipart uploading.

Can I upload files greater than 5GB with django-s3file?

minimal speed up, files uploading to django server and not to tmp/s3file

Upload works, but I still get tmp*.upload.* files created in the /tmp folder on the Django server. The tmp/s3file upload folder doesn't get created in the bucket. Also, I'm not seeing any improvement in upload times for medium/large (500 MB / 5 GB) files; however, I am getting some speedup when uploading many files in a directory. For my use case I am more interested in uploading multiple files in one POST. Hope I can get some help here.

Admin file upload with `blank=True` fails due to a missing file

Hey there, thank you for this useful project.

I'm using
django ==5.0.3
django-s3file ==5.5.5
django-storages ==1.14.2

I'm using django-pictures with django-s3file to serve and store my images.
I have a model field setup like so

    image = PictureField(
        _("image"),
        upload_to=FilePattern(
            filename_pattern="fancy-pattern",
        ),
        aspect_ratios=["1/1"],
        storage=some_s3_storage,
        blank=True,
        file_types=["WEBP", "JPEG"],
        width_field="image_width",
        height_field="image_height",
    )

Now if I try to save an instance of the model in the admin, I'm getting the following error message:

TypeError: object null is not iterable (cannot read property Symbol(Symbol.iterator))
  at Function.from(<anonymous>)
  at uploadFiles(/static/s3file/js/s3file.min.7bb6c3f3de1e.js:1:164)
  at ? (/static/s3file/js/s3file.min.7bb6c3f3de1e.js:1:2879)
  at Array.forEach(<anonymous>)
  at uploadS3Inputs(/static/s3file/js/s3file.min.7bb6c3f3de1e.js:1:2835)
  at HTMLFormElement.<anonymous>(/static/s3file/js/s3file.min.7bb6c3f3de1e.js:1:3350)

If I remove blank=True from the model definition, the upload works without a problem.
Do you have an idea what is wrong here?

secret key as bytes leads to error when AWS_SECRET_ACCESS_KEY is binary

Traceback (most recent call last):
  File "/Users/syphar/src/thermondo-backend/venv3/lib/python3.4/site-packages/s3file/views.py", line 92, in sign
    self.get_secret_access_key(),
  File "/Users/syphar/src/thermondo-backend/venv3/lib/python3.4/site-packages/s3file/views.py", line 48, in get_secret_access_key
    return binary_type(self.secret_access_key.encode('utf-8'))
AttributeError: 'bytes' object has no attribute 'encode'

File does not exist

ClientError: An error occurred (403) when calling the HeadObject operation: Forbidden
  File "django/core/handlers/exception.py", line 35, in inner
    response = get_response(request)
  File "s3file/middleware.py", line 15, in __call__
    request.FILES.setlist(field_name, list(self.get_files_from_storage(paths)))
  File "s3file/middleware.py", line 23, in get_files_from_storage
    f = default_storage.open(path)
  File "django/core/files/storage.py", line 33, in open
    return self._open(name, mode)
  File "storages/backends/s3boto3.py", line 464, in _open
    f = S3Boto3StorageFile(name, mode, self)
  File "storages/backends/s3boto3.py", line 72, in __init__
    self.obj.load()
  File "boto3/resources/factory.py", line 505, in do_action
    response = action(self, *args, **kwargs)
  File "boto3/resources/action.py", line 83, in __call__
    response = getattr(parent.meta.client, operation_name)(**params)
  File "botocore/client.py", line 320, in _api_call
    return self._make_api_call(operation_name, kwargs)
  File "botocore/client.py", line 623, in _make_api_call
    raise error_class(parsed_response, operation_name)

CORS error when attempting to upload non-image to ImageField

If you have an ImageField, and try to upload a non-image file, the POST request initiated by s3file.js to the S3 bucket fails with a CORS error.

In Firefox, the error is Cross-Origin Request Blocked: The Same Origin Policy disallows reading the remote resource at https://bucketname.s3.amazonaws.com/. (Reason: CORS request did not succeed). Status code: (null).

If you instead upload a valid image file to that same field, the request succeeds with no CORS error.

If you change the extension of a non-image file to .png, and upload it to the ImageField, there is no CORS error but the "Upload a valid image. The file you uploaded was either not an image or a corrupted image." error is displayed as expected.

My expectation is that if you try to upload a non-image file to an ImageField, it would be uploaded to tmp/s3file/ with no CORS error. The field validators would then execute, which would recognize that the uploaded file is not an image or doesn't have the correct extension, then an error would be returned and displayed to the user like normal. Alternatively, I would expect there to be no CORS error but instead a 4xx response from S3 if the content type is not allowed.

I haven't yet been able to identify why this is happening. I see that in the inputs data-fields-policy, it specifies ["starts-with", "$Content-Type", "image/"], but I don't know how this could cause a CORS error.

AWS_LOCATION or S3Boto3Storage.location not supported

Because I am using my own custom storage class inherited from S3Boto3Storage and changed the S3Boto3Storage.location (the default is: '') , the new location is not being picked up by s3file. By default, s3file uploads to <bucket_url>/tmp/s3file/ or to the folder specified by django.conf.settings.S3FILE_UPLOAD_PATH. It doesn't know about the prefix added to the path by S3Boto3Storage._normalize_name() when default_storage.location is anything other than the empty string.

My initial thought for a fix is just to prepend default_storage.location to upload_path in the forms.py module and then strip it out before passing to django-storages.

Documentation Improvement: Using s3file in development

I am a bit confused while reading the documentation about testing s3file in development.

On one hand it says

"The ClearableFileInput widget is only than automatically replaced when the DEFAULT_FILE_STORAGE setting is set to django-storages’ S3Boto3Storage"

and on the other it says

"S3File comes with a AWS S3 dummy backend. It behaves similar to the real S3 storage backend. It is automatically enabled, if the DEFAULT_FILE_STORAGE setting is set to FileSystemStorage."

So, if I want to use the dummy S3 backend, I cannot utilise the S3FileInputMixin.

If the documentation had steps for manually adding S3FileInputMixin in development, that would be helpful.
Otherwise I will have to change the code in apps.py to look for some other variable in settings to determine whether we are in development mode.

Progress event listener isn't working?!

I tried implementing the progress bar, but I can't seem to access the progress attribute on my form.

Also, the event listener on the README doesn't seem to trigger at all.

I've made sure that the middleware and the app are installed

I'm not sure where the problem could be, but I'm willing to contribute if I understand what needs to be done.
