dhomane / amazon-s3-cache-with-amazon-elasticache-redis

This project is forked from aws-samples/amazon-s3-cache-with-amazon-elasticache-redis.



License: MIT No Attribution



Caching Amazon S3 with Amazon ElastiCache for Redis

This sample project demonstrates how you can cache Amazon S3 objects with Amazon ElastiCache for Redis. It also uses AWS CloudFormation and AWS Cloud9 to deploy, build, and run the tutorial, although you can run it in your own environment as well.

These examples are also referenced in the following blog post, which provides background and context for this project. Reading the blog post first is recommended.

Deployment

  1. Download this repository from GitHub, then deploy the following template with AWS CloudFormation: cfn/S3RedisCFN.yaml

  2. When you run the CFN template, you will be prompted to enter a subnet ID in which AWS Cloud9 and Amazon ElastiCache will be launched. Enter a subnet ID, then click Next, Next, Create. (Note: this step ensures that both services run within the same Availability Zone for optimal performance. You can find your subnet IDs in the Amazon VPC console. Be sure the subnet is associated with a route table that has a route to an internet gateway, which Cloud9 requires.)

Setup and Build

  1. Upon CFN completion, take note of the generated S3 bucket name and the Redis endpoint in the CloudFormation Outputs tab. Then navigate to AWS Cloud9 and open the S3RedisCache IDE environment.

  2. Within the AWS Cloud9 environment, open (+) a new terminal and clone this repository:

        (ssh) 
        git clone [email protected]:aws-samples/amazon-S3-cache-with-amazon-elasticache-redis.git 
        
        (https)
        git clone https://github.com/aws-samples/amazon-S3-cache-with-amazon-elasticache-redis.git 
    
  3. Navigate to the downloaded setup directory (amazon-S3-cache-with-amazon-elasticache-redis/setup) and run the following script to further prepare your environment:

        cd amazon-S3-cache-with-amazon-elasticache-redis/setup
        sh s3_redis_project_setup.sh  
    
  4. Navigate to the resources directory (amazon-S3-cache-with-amazon-elasticache-redis/resources) and update the following properties in constants.py with the resource values you captured from the CloudFormation Outputs tab:

        redishost = ""   # the Redis endpoint, without the port
        S3bucket = ""    # the generated S3 bucket name
  5. Next, right-click and run load_data.py. This will generate and load 100 objects into both Amazon S3 and Amazon ElastiCache for Redis.

  6. Next, right-click and run query_redis.py and query_S3.py, then compare the generated latency output (in microseconds).
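The comparison made by query_redis.py and query_S3.py comes down to timing a single GET against each service. A minimal sketch of that kind of measurement, using an in-memory dict as a stand-in for the Redis/S3 clients (the helper name and structure here are illustrative, not the scripts' actual code):

```python
import time

def timed_get(fetch, key):
    """Time a single fetch(key) call; return the value and elapsed microseconds."""
    start = time.perf_counter()
    value = fetch(key)
    elapsed_us = (time.perf_counter() - start) * 1_000_000
    return value, elapsed_us

# In-memory dict standing in for a real client:
store = {"object_1": b"payload"}
value, micros = timed_get(store.get, "object_1")
```

With a real client, fetch would be something like redis_client.get or a wrapper around an S3 GetObject call; the Redis lookup stays in process-to-cache network territory while each S3 GET pays a full HTTPS request round trip, which is where the latency gap in the next section comes from.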

You will notice a significant performance improvement when querying Redis versus S3. This performance test is lightweight and intended for illustration purposes only; your results may vary slightly based on your environment. An example comparison between the two services, converted to milliseconds, is as follows:

(Figure: latency comparison between Redis and S3, in milliseconds)

Lazy-load example

A common caching technique is lazy loading. This approach first checks the cache for the requested data; on a miss, it retrieves the data from the origin data source and then caches it for future requests. To illustrate this example, we must first flush the Redis cache.

  1. Right-click and run flush_redis.py (this deletes all of your keys).

  2. Right-click and run lazy_load.py, found in amazon-S3-cache-with-amazon-elasticache-redis/examples/lazyload. On the first run, you will see a cache miss because the object was not initially cached in Redis. Run the script again and you will see a cache hit, since the object was set into Redis after the initial cache miss.
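The miss-then-hit behavior above can be sketched as a small function. This is an illustrative outline of the lazy-loading pattern, not the actual lazy_load.py code; plain dicts stand in for the ElastiCache and S3 clients:

```python
def lazy_get(cache, origin, key):
    """Lazy loading: try the cache first; on a miss, fetch from the
    origin store and backfill the cache for future requests."""
    value = cache.get(key)
    if value is not None:
        return value, "hit"
    value = origin[key]      # origin fetch (Amazon S3 in this project)
    cache[key] = value       # backfill (a Redis SET in this project)
    return value, "miss"

# Dict stand-ins for ElastiCache (cache) and S3 (origin):
cache, origin = {}, {"object_1": b"payload"}
_, first = lazy_get(cache, origin, "object_1")   # cache is empty -> miss
_, second = lazy_get(cache, origin, "object_1")  # backfilled -> hit
```

This mirrors the two runs of lazy_load.py: the first call misses and populates the cache, and the second call is served entirely from it.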

Terminate your environment

After running these examples, terminate your environment with the following steps:

  1. Right-click and run delete_S3_objects.py in the amazon-S3-cache-with-amazon-elasticache-redis/resources directory. This will delete all of your generated S3 objects.

  2. Next, in the AWS CloudFormation console, delete the stack you launched.

Contributors: mikelabib
