ex-aws / ex_aws

A flexible, easy-to-use set of AWS API clients for Elixir

Home Page: https://hex.pm/packages/ex_aws

License: MIT License


ex_aws's Introduction

ExAws


A flexible, easy-to-use set of AWS API clients.

Available Services: https://github.com/ex-aws?q=service&type=&language=

Getting Started

ExAws v2.0 breaks out every service into its own package. To use the S3 service, you need both the core :ex_aws package as well as the :ex_aws_s3 package.

As with all ExAws services, you'll need a compatible HTTP client (defaults to :hackney) and whatever JSON or XML codecs are needed by the services you want to use. Consult the individual service documentation for details on what each service needs.

defp deps do
  [
    {:ex_aws, "~> 2.1"},
    {:ex_aws_s3, "~> 2.0"},
    {:hackney, "~> 1.9"},
    {:sweet_xml, "~> 0.6"}
  ]
end

With these deps you can use ExAws precisely as you're used to:

# make a request (with the default region)
ExAws.S3.list_objects("my-bucket") |> ExAws.request()

# or specify the region
ExAws.S3.list_objects("my-bucket") |> ExAws.request(region: "us-west-1")

# some operations support streaming
ExAws.S3.list_objects("my-bucket") |> ExAws.stream!() |> Enum.to_list()

AWS Key configuration

ExAws requires valid AWS keys in order to work properly. ExAws by default does the equivalent of:

config :ex_aws,
  access_key_id: [{:system, "AWS_ACCESS_KEY_ID"}, :instance_role],
  secret_access_key: [{:system, "AWS_SECRET_ACCESS_KEY"}, :instance_role]

This means it will try to resolve credentials in order:

  • Look for the AWS standard AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY environment variables
  • Resolve credentials with IAM
    • If running inside ECS and a task role has been assigned it will use it
    • Otherwise it will fall back to the instance role
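Each entry in the list is tried in order until one yields credentials, so the chain can be tailored per environment. For example, a development-only override could look like the sketch below (the key values are made up; never commit real keys):

```elixir
# config/dev.exs -- hypothetical local-development credentials that
# short-circuit the resolution chain for this environment only.
import Config

config :ex_aws,
  access_key_id: "local-dev-key",
  secret_access_key: "local-dev-secret"
```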

AWS CLI config files are supported, but require an additional dependency:

{:configparser_ex, "~> 4.0"}

You can then add {:awscli, "profile_name", timeout} to the above config and it will pull information from ~/.aws/config and ~/.aws/credentials.

Alternatively, if you already have a profile name set in the AWS_PROFILE environment variable, you can use that with {:awscli, :system, timeout}:

config :ex_aws,
  access_key_id: [{:system, "AWS_ACCESS_KEY_ID"}, {:awscli, "default", 30}, :instance_role],
  secret_access_key: [{:system, "AWS_SECRET_ACCESS_KEY"}, {:awscli, "default", 30}, :instance_role]

For role based authentication via role_arn and source_profile an additional dependency is required:

{:ex_aws_sts, "~> 2.0"}

Further information on role-based authentication is provided in that package's documentation.

Session token configuration

Alternatively, you can provide AWS_SESSION_TOKEN via security_token to authenticate with a session token:

config :ex_aws,
  access_key_id: {:system, "AWS_ACCESS_KEY_ID"},
  security_token: {:system, "AWS_SESSION_TOKEN"},
  secret_access_key: {:system, "AWS_SECRET_ACCESS_KEY"}

Hackney configuration

ExAws by default uses hackney to make HTTP requests to the AWS APIs. You can modify the hackney options like so:

config :ex_aws, :hackney_opts,
  follow_redirect: true,
  recv_timeout: 30_000

AWS Region Configuration

You can set the region used by default for requests.

config :ex_aws,
  region: "us-west-2"

Alternatively, the region can be set in an environment variable:

config :ex_aws,
  region: {:system, "AWS_REGION"}

JSON Codec Configuration

The default JSON codec is Jason. You can choose a different one:

config :ex_aws,
  json_codec: Poison

Path Normalization

Paths that include multiple consecutive /'s will by default be normalized to a single slash. There are cases when paths need to be literal (e.g. S3), and this normalization behaviour can be turned off via configuration:

config :ex_aws,
  normalize_path: false

Direct Usage

ExAws can also be used directly without any specific service module.

You need to figure out how the API of the specific AWS service works, in particular:

  • Protocol (JSON or query).
  • Path (depends on the service and the specific operation, usually "/").
  • Service name (used to generate the request signature, as described here).
  • Request body, query params, HTTP method, and headers (depends on the service and specific operation).

You can look for this information in the service's API reference at docs.aws.amazon.com or, for example, in the Go SDK API models at github.com/aws/aws-sdk-go (look for a api-*.json file).

The protocol dictates which operation module to use for the request. If the protocol is JSON, use ExAws.Operation.JSON; if it's query, use ExAws.Operation.Query.

Examples

Redshift DescribeClusters

action = :describe_clusters
action_string = action |> Atom.to_string |> Macro.camelize

operation =
  %ExAws.Operation.Query{
    path: "/",
    params: %{"Action" => action_string},
    service: :redshift,
    action: action
  }

ExAws.request(operation)

ECS RunTask

data = %{
  taskDefinition: "hello_world",
  launchType: "FARGATE",
  networkConfiguration: %{
    awsvpcConfiguration: %{
      subnets: ["subnet-1a2b3c4d", "subnet-4d3c2b1a"],
      securityGroups: ["sg-1a2b3c4d"],
      assignPublicIp: "ENABLED"
    }
  }
}

operation =
  %ExAws.Operation.JSON{
    http_method: :post,
    headers: [
      {"x-amz-target", "AmazonEC2ContainerServiceV20141113.RunTask"},
      {"content-type", "application/x-amz-json-1.1"}
    ],
    path: "/",
    data: data,
    service: :ecs
  }

ExAws.request(operation)

Highlighted Features

  • Easy configuration.
  • Minimal dependencies. Choose your favorite JSON codec and HTTP client.
  • Elixir streams to automatically retrieve paginated resources.
  • Elixir protocols allow easy customization of Dynamo encoding / decoding.
  • Simple. ExAws aims to provide a clear and consistent elixir wrapping around AWS APIs, not abstract them away entirely. For every action in a given AWS API there is a corresponding function within the appropriate module. Higher level abstractions like the aforementioned streams are in addition to and not instead of basic API calls.

That's it!

Retries

ExAws will retry failed AWS API requests using exponential backoff per the "Full Jitter" formula described in https://www.awsarchitectureblog.com/2015/03/backoff.html.

The algorithm uses three values, which are configurable:

# default values shown below

config :ex_aws, :retries,
  max_attempts: 10,
  base_backoff_in_ms: 10,
  max_backoff_in_ms: 10_000

  • max_attempts is the maximum number of possible attempts with backoffs in between each one
  • base_backoff_in_ms corresponds to the base value described in the blog post
  • max_backoff_in_ms corresponds to the cap value described in the blog post
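Putting those three values together, the delay before each retry follows the Full Jitter formula; the sketch below illustrates the idea (this is not ExAws's actual source, and the defaults are taken from the config above):

```elixir
defmodule BackoffSketch do
  # Full Jitter: delay(n) is a uniform random value in
  # [0, min(cap, base * 2^attempt)], in milliseconds.
  def delay_ms(attempt, base \\ 10, cap \\ 10_000) do
    upper = min(cap, base * Integer.pow(2, attempt))
    :rand.uniform(upper + 1) - 1
  end
end
```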

Testing

If you want to run mix test, you'll need a local DynamoDB running on port 8000:

docker run --rm -d -p 8000:8000 amazon/dynamodb-local -jar DynamoDBLocal.jar -port 8000

For more info please see Setting up DynamoDB Local.

The redirect test will intentionally cause a warning to be issued.

License

The MIT License (MIT)

Copyright (c) 2014-2020 CargoSense, Inc.

Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.

ex_aws's People

Contributors

aaronrenner, adamkittelson, aseigo, benwilson512, bernardd, bruce, cjbottaro, claudiostahl, dependabot-preview[bot], dependabot[bot], ericmj, firx, jacobsmith, james-gibson-fr, joeybaer, jordi-chacon, kanmo, kianmeng, kyleaa, lessless, mmzx, prato, rjenkins2, ruslandoga, scohen, tattdcodemonkey, vasilecimpean, visciang, wkirschbaum, zombieharvester


ex_aws's Issues

Very poor doc experience in iex

I'm used to poking around in iex using h to figure out library APIs. Your hexdocs.pm docs are pretty good but in iex the docs are awful:

(screenshot: `h` output in iex showing no documentation)

I'm able to look at the source and hexdocs.pm to figure things out, but it would be great if you had usable docs available in iex. Actually, the reason I was even trying erlcloud was because I first tried ExAws, noticed the lack of iex docs and took that as a sign that ExAws is just a prototype so far, and that I should therefore look elsewhere for an S3 client. I guess I'm spoiled by many other elixir libraries having good docs available in iex :).

Dynamo Key requests with Ranges

Dynamo queries that take a primary key need to support more than one, because you can have primary keys based on hashes and ranges.

security header for s3 when using temp credentials

Hello,

When using temp ec2 credentials based on an IAM role, you must pass a x-amz-security-token header when making requests to S3.

The token value is just the token value from the instance metadata.

More info at: http://docs.aws.amazon.com/AmazonS3/latest/dev/RESTAuthentication.html

Currently, all S3 requests using the temp credentials result in: The AWS Access Key Id you provided does not exist in our records.

This should go away once the header is added in.

range key for dynamodb

doesn't look like RANGE key is supported in key_schema for create_table. is this intentional?

Dynamo.Encodable protocol not implemented

Hi Ben!

I am using version 0.4.8.

I am getting the following error when putting an item into Dynamo:

** (exit) an exception was raised:
    ** (Protocol.UndefinedError) protocol ExAws.Dynamo.Encodable not implemented for %Service.Operation{oid: "1442-696618-318045", type: "2"}
        (ex_aws) lib/ex_aws/dynamo/encodable.ex:1: ExAws.Dynamo.Encodable.impl_for!/1
        (ex_aws) lib/ex_aws/dynamo/encodable.ex:6: ExAws.Dynamo.Encodable.encode/1
        (ex_aws) lib/ex_aws/dynamo/impl.ex:147: ExAws.Dynamo.Impl.put_item/4

Since I have declared the Encodable protocol in the Operation module I am really confused by this error. Here is a simplified version of the Operation module, which still causes the exception:

defmodule Service.Operation do
  alias __MODULE__

  @derive [ExAws.Dynamo.Encodable]
  defstruct [:oid, :type]

  def setup_store do
    ExAws.Dynamo.create_table("Operations", "oid", %{oid: :string}, 1, 1)
  end

  def new(oid, _, _, _) do
    %Operation{type: "2",
               oid: oid
              }
  end

  def write(operation) do
    ExAws.Dynamo.put_item("Operations", operation)
  end
end

Obviously I must be doing something wrong but I honestly can't see it... I'd appreciate some help.

Thank you again Ben!

moar envs!

i'd like to use MIX_ENV=gamma or others. It currently fails with

 21) test sort works (OuteTest)
     test/backends_test.exs:125
     ** (FunctionClauseError) no function clause matching in ExAws.Config.Defaul
ts.defaults/1
     stacktrace:
       lib/ex_aws/config/defaults.ex:10: ExAws.Config.Defaults.defaults(:gamma)
       lib/ex_aws/config/defaults.ex:4: ExAws.Config.Defaults.defaults/0
       lib/ex_aws/config.ex:26: ExAws.Config.get/1
       lib/ex_aws/config.ex:14: ExAws.Config.build/2
       lib/ex_aws/dynamo.ex:2: ExAws.Dynamo.delete_table/1
       test/backends_test.exs:12: OuteTest.__ex_unit_setup_0/1
       test/backends_test.exs:1: OuteTest.__ex_unit__/2

working test and or Readme documenting stream_query query get etc

would be useful to see example tests for query and stream_query for ppl not familiar with dynamodb query expressions. also might want to clarify position on queryfilter and if it is not going to be supported.

i could not get the following to work

user = %User{admin: false, age: 23, email: "[email protected]", name: "Bubba"}
Dynamo.scan("Users",
      limit: 12,
      exclusive_start_key: [email: "email"],
      expression_attribute_names: [email: "#email"],
      expression_attribute_values: [email: "[email protected]"],
      filter_expression: "#email = :email")

{:error,
 {:http_error, 400,
  "{\"__type\":\"com.amazon.coral.validate#ValidationException\",\"Message\":\"ExpressionAttributeNames contains invalid key: Syntax error; key: \\\"email\\\"\"}"}}

list in map can't decode

it stored ok, but can't decode

iex(29)> ExAws.Dynamo.get_item("Graph-dev",%{id: "3",r: "a"})
{:ok,
 %{"Item" => %{"created_at" => %{"N" => "1431134399.66159391403"},
     "id" => %{"S" => "3"}, "lst" => %{"NS" => ["4", "5", "6"]},
     "r" => %{"S" => "a"}, "that" => %{"N" => "321"},
     "this" => %{"S" => "this"}}}}

iex(30)> ExAws.Dynamo.get_item!("Graph-dev",%{id: "3",r: "a"}) |> ExAws.Dynamo.Decoder.decode
** (FunctionClauseError) no function clause matching in ExAws.Dynamo.Decoder.decode/1
             lib/ex_aws/dynamo/decoder.ex:34: ExAws.Dynamo.Decoder.decode(["4", "5", "6"])
             lib/ex_aws/dynamo/decoder.ex:47: anonymous fn/2 in ExAws.Dynamo.Decoder.decode/1
    (stdlib) lists.erl:1261: :lists.foldl/3
             lib/ex_aws/dynamo/decoder.ex:47: anonymous fn/2 in ExAws.Dynamo.Decoder.decode/1
    (stdlib) lists.erl:1261: :lists.foldl/3

Dynamo conversion protocol

Instead of using guard clauses, Dynamo should have a dynamization protocol that is implemented for the various data types.

sweet_xml not optional

== Compilation error on file lib/ex_aws/s3/parsers.ex ==
** (CompileError) lib/ex_aws/s3/parsers.ex:3: module SweetXml is not loaded and could not be found
    (elixir) expanding macro: Kernel.if/2
    lib/ex_aws/s3/parsers.ex:2: ExAws.S3.Parsers (module)
    (elixir) lib/kernel/parallel_compiler.ex:100: anonymous fn/4 in Kernel.ParallelCompiler.spawn_compilers/8

In addition, the sweet_xml dependency requires an older version:

Looking up alternatives for conflicting requirements on sweet_xml
  Activated version: 0.5.0
  From mix.exs: ~> 0.5.0
  From ex_aws v0.4.11: ~> 0.2.1

It would be nice to either make this truly optional or at least upgrade the dep to the latest version.

Can't configure ExAws to use Dynamodb Local anymore

Hi Ben,

I was using DynamoDB Local via ExAws just fine until this commit: d68ae5f

I have host: "localhost" which seems to cause the following crash:

    ** (EXIT) an exception was raised:
        ** (ArgumentError) argument error
            (stdlib) binary.erl:317: :binary.replace/4
            (ex_aws) lib/ex_aws/config.ex:67: ExAws.Config.parse_host_for_region/1
            (ex_aws) lib/ex_aws/config.ex:16: ExAws.Config.build/2

Is it still possible to use DynamoDB Local via ExAws? If so, how? If not, could you consider adding that option back?

Thanks!

Safely change AWS credentials per request

I'm looking for a way to safely change AWS credentials for each request; I use the request to determine which account and bucket to use.

Thanks for ExAws and your help! 👍
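For what it's worth, ExAws.request/2 accepts per-call configuration overrides (assuming an ExAws 2.x version, which merges overrides passed as the second argument), so credentials can be supplied at request time; the bucket and key values here are made up:

```elixir
# Hypothetical per-tenant credentials resolved at request time.
creds = [
  access_key_id: "tenant-key",
  secret_access_key: "tenant-secret",
  region: "us-east-1"
]

ExAws.S3.list_objects("tenant-bucket")
|> ExAws.request(creds)
```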

Poison version in conflict with Phoenix

Hello,

What would be the best way to fix this conflicting version issue?

Looking up alternatives for conflicting requirements on poison
  From ex_aws v0.4.9: ~> 1.2.0
  From phoenix v1.0.2: ~> 1.3
** (Mix) Hex dependency resolution failed, relax the version requirements or unlock dependencies

Thanks!

Follow S3 redirects for regions

I can't get any request to bucket on regions other than "US Standard" to work.

iex(4)> ExAws.S3.list_objects "bucket"
Request URL: "https://bucket.s3.amazonaws.com/"
...
** (CaseClauseError) no case clause matching: {:ok, %HTTPoison.Response{body: "<?xml version=\"1.0\" encoding=\"UTF-8\"?>\n<Error><Code>TemporaryRedirect</Code><Message>Please re-send this request to the specified temporary endpoint. Continue to use the original request endpoint for future requests.</Message><Bucket>bucket</Bucket><Endpoint>bucket.s3-eu-west-1.amazonaws.com</Endpoint><RequestId>...</RequestId><HostId>...</HostId></Error>", headers: [{"x-amz-bucket-region", "eu-west-1"}, {"x-amz-request-id", "..."}, {"x-amz-id-2", "..."}, {"Location", "https://shim-wtf.s3-us-west-2.amazonaws.com/"}, {"Content-Type", "application/xml"}, {"Transfer-Encoding", "chunked"}, {"Date", "Tue, 08 Sep 2015 13:04:34 GMT"}, {"Server", "AmazonS3"}], status_code: 307}}
    (ex_aws) lib/ex_aws/request.ex:33: ExAws.Request.request_and_retry/7
    (ex_aws) lib/ex_aws/s3/impl.ex:52: ExAws.S3.Impl.list_objects/3

Configuring s3.region doesn't make any difference...

scan from tests not working

I get the following when running the scan from your dynamodb test

iex(27)> Dynamo.scan("Users",
...(27)>       limit: 12,
...(27)>       exclusive_start_key: [api_key: "api_key"],
...(27)>       expression_attribute_names: [api_key: "#api_key"],
...(27)>       expression_attribute_values: [api_key: "asdfasdfasdf", name: "bubba"],
...(27)>       filter_expression: "ApiKey = #api_key and Name = :name")
{:error,
 {:http_error, 400,
  "{\"__type\":\"com.amazon.coral.validate#ValidationException\",\"Message\":\"ExpressionAttributeNames contains invalid key: Syntax error; key: \\\"api_key\\\"\"}"}}

Lazy functions

The lazy functions could be lazier. The issue is that Stream functions like unfold/2 and resource/3 calculate the N+1th value at the same time that you retrieve the Nth value. For example

  defp build_scan_stream(initial, request_fun) do
    Stream.resource(fn -> initial end, fn
      :quit -> {:halt, nil}

      {:error, items} -> {[{:error, items}], :quit}

      {:ok, %{"Items" => items, "LastEvaluatedKey" => key}} ->
        {items, request_fun.(%{ExclusiveStartKey: key})}

      {:ok, %{"Items" => items}} ->
        {items, :quit}
    end, &pass/1)
  end

request_fun.(%{ExclusiveStartKey: key}) is called at the time the initial item set is added to the accumulator. Suppose initial looks like {:ok, %{"Items" => [1,2,3], "LastEvaluatedKey" => 3}}. This pattern matches to the {:ok, %{"Items" => items, "LastEvaluatedKey" => key}} -> clause, which means that request_fun is called EVEN if you only call Enum.take(1) on the stream. What would be desirable is if there was some way to have request_fun not called until it was actually needed.
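One hypothetical fix (a sketch, not a tested patch to ExAws) is to keep a thunk in the accumulator so the next request is only issued when the stream is actually advanced past the current page:

```elixir
defp build_scan_stream(initial, request_fun) do
  Stream.resource(
    # Wrap the first page in a thunk too, for a uniform accumulator shape.
    fn -> fn -> initial end end,
    fn
      :quit ->
        {:halt, nil}

      thunk ->
        # The HTTP request only happens here, when the stream is advanced.
        case thunk.() do
          {:error, reason} ->
            {[{:error, reason}], :quit}

          {:ok, %{"Items" => items, "LastEvaluatedKey" => key}} ->
            # Defer the N+1th request behind another thunk.
            {items, fn -> request_fun.(%{ExclusiveStartKey: key}) end}

          {:ok, %{"Items" => items}} ->
            {items, :quit}
        end
    end,
    fn _ -> :ok end
  )
end
```

With this shape, Enum.take(1) on the stream would no longer trigger a fetch of the second page.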

SweetXML as hard dependency on 0.4.10?

Hi Ben,

just updated ex_aws dependency on my project from 0.4.8 to 0.4.10 and now get this error on compilation:

== Compilation error on file lib/ex_aws/s3/parsers.ex ==
** (CompileError) lib/ex_aws/s3/parsers.ex:3: module SweetXml is not loaded and could not be found
    (elixir) expanding macro: Kernel.if/2
    lib/ex_aws/s3/parsers.ex:2: ExAws.S3.Parsers (module)
    (elixir) lib/kernel/parallel_compiler.ex:97: anonymous fn/4 in Kernel.ParallelCompiler.spawn_compilers/8

could not compile dependency ex_aws, mix compile failed. You can recompile this dependency with `mix deps.compile ex_aws` or update it with `mix deps.update ex_aws

Has SweetXML become a mandatory dependency?

Thanks!

get_item! empty result

wondering if {:ok, %{}} is correct on an empty result. {:ok,nil} may be better. maybe not for get_item!, but get_item should return an empty map if you expect people to use case

this smells a bit

case Dynamo.get_item!(..) do
  {:ok,map} when map == %{} -> ....
  #  {:ok,%{}} always matches so we can't use it with case...

Can't create an S3 client module using `otp_app` option

Based on these docs, I expected the following to work:

# lib/delorean/s3.ex
defmodule Delorean.S3 do
  use ExAws.S3.Client, opt_app: :delorean
end
# config/config.exs
use Mix.config

config :delorean, ExAws,
  s3: [region: "us-west-2", scheme: "https://", host: "s3.amazonaws.com"]

import_config "s3_creds.exs"
# config/s3_creds.exs
use Mix.Config

config :delorean, ExAws,
  access_key_id: "redacted",
  secret_access_key: "redacted"

However, I get an error:

     ** (RuntimeError) A valid configuration root is required in your s3 client
     stacktrace:
       (ex_aws) lib/ex_aws/config.ex:23: ExAws.Config.get/1
       (ex_aws) lib/ex_aws/config.ex:14: ExAws.Config.build/2
       (delorean) lib/delorean/s3.ex:2: Delorean.S3.list_objects!/2

I can get it to work by changing lib/delorean/s3.ex to:

defmodule Delorean.S3 do
  use ExAws.S3.Client

  def config_root do
    Application.get_all_env(:delorean)[ExAws]
  end
end

...but that doesn't seem necessary. What am I doing wrong? Or are your docs inaccurate?

Exception on S3 delete_object

** (Protocol.UndefinedError) protocol Enumerable not implemented for "connection/1.0.2/css/elixir.css"
(elixir) lib/enum.ex:1: Enumerable.impl_for!/1
(elixir) lib/enum.ex:112: Enumerable.reduce/3
(elixir) lib/enum.ex:1398: Enum.reduce/3
(ex_aws) lib/ex_aws/s3/impl.ex:201: ExAws.S3.Impl.delete_object/4
(ex_aws) lib/ex_aws/s3/impl.ex:204: ExAws.S3.Impl.delete_object!/4
(elixir) lib/enum.ex:585: anonymous fn/3 in Enum.each/2
(elixir) lib/enum.ex:1390: anonymous fn/3 in Enum.reduce/3
(elixir) lib/stream.ex:1220: anonymous fn/3 in Enumerable.Stream.reduce/3
(elixir) lib/enum.ex:2607: Enumerable.List.reduce/3
(elixir) lib/stream.ex:1119: Stream.do_list_resource/6
(elixir) lib/stream.ex:1240: Enumerable.Stream.do_each/4
(elixir) lib/enum.ex:1389: Enum.reduce/3
(elixir) lib/enum.ex:584: Enum.each/2
(hex_web) lib/hex_web/api/handlers/docs.ex:120: HexWeb.API.Handlers.Docs.upload_docs/3
(elixir) lib/task/supervised.ex:74: Task.Supervised.do_apply/2
(stdlib) proc_lib.erl:240: :proc_lib.init_p_do_apply/

Happens on commit 649d36c.

Dynamo decoder decodes string value "TRUE" to the :true atom

Currently, the dynamo decoder is setup to decode string data types with a value of "TRUE" to the atom true upon deserialization.

That absolutely makes sense when the dynamo type is boolean but I'm not so sure if it makes sense when the dynamo data type is string. In the case of a string data type, it seems like more of a bandaid for those that are not using the proper types when the better solution would be to simply fix their types.

I'm dealing with stock market and "TRUE" is a valid ticker symbol.

This leads to strange behavior, since I cannot do this...

# Retrieve metadata
{:ok, %{"Item" => item}} = ExAws.Dynamo.get_item(table(), %{"symbol" => symbol})
metadata = ExAws.Dynamo.Decoder.decode(item, as: Metadata)

# Update metadata and save it back
new_metadata = update_metadata(metadata, new_data)
ExAws.Dynamo.put_item table(), new_metadata

Because the decode transforms the "TRUE" string into an atom, the put_item blows up with a validation exception since I'm now trying to save a boolean value into a key field which is a string field:

{:error, {"ValidationException", "One or more parameter values were invalid: Type mismatch for key symbol expected: S actual: BOOL"}}

It seems like it makes sense to remove these two cases from the decode function:

  def decode(%{"S" => "TRUE"}),     do: true
  def decode(%{"S" => "FALSE"}),    do: false

What do you think?

streamlining

seems like having to pipe to decode is a bit terse, would you consider some short cuts to return a decoded item or list? something

get_map(table_name,primary_key) :: {Atom, Map}
query_maps( table_name,primary_key) :: {Atom,List}

you could also do

get_struct(table_name,primary_key,struct_name)
query_structs(...)

also, it is unclear how you use a primary_key and range_key with get_item

finally wondering if there is a way to annotate the primary key, range key and even table_name in the module. it would be nice to derive this for things like

defmodule User do
  @table_name "Users"
  @primary_key "email"
  @range_key "age"
  @derive [ExAws.Dynamo.Encodable]
  defstruct [:email, :name, :age, :admin]
end


user = %User{admin: false, age: 23, email: "[email protected]", name: "Bubba"}
put_item(user)
new_user = %User{admin: true, age: 33, email: "[email protected]", name: "Unf"}
update_item(new_user)

Uploads to S3 failing

This is going to be a very unsatisfactory issue report, I apologise in advance.

This was reported by @ericmj and experienced by myself in one of our projects.

It seems that sometimes uploads to S3 stop working. No error is being raised and it looks like they just stall. Data doesn't make it into S3 but again: No error.

Best guess right now is that requests are timing out and the pool only holds one connection.

Possible fixes:

  • Timeouts (e.g. recv_timeout, pool timeout, etc.)
  • Increase pool size

Bang functions

In Dynamo and other APIs, there is clear value in having ! function corollaries to a lot of regular functions. The question is, for something like get_item!, should it unwrap {:ok, result} to just result, or {:ok, %{"Item" => item}} to item? It's worth keeping in mind that if we do the latter it invalidates certain options like :return_consumed_capacity which add an attribute to the result map.

Basically the question boils down to what the ! asserts in something like scan!, query!, get_records!, etc. Are we asserting that the request succeeds in an HTTP sense, or are we trying to have it do more? I'm presently inclined towards the former option, but will think it over more.

DynamoDB: Float encoding

The float encoding uses Float.to_string/1, which means that the values stored in Dynamo can be different from the ones in the source map. IMO, the encoding shouldn't implicitly transform the values it encodes.

e.g.
1.67 will be stored as "1.66999999999999992895e+00" instead of 1.67

A simple solution would be to use to_string/1 instead of Float.to_string/1. However, to_string/1 is much slower than Float.to_string/1.

Another solution is to be able to set the Float.to_string/1 options.
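As a point of reference, Erlang/OTP 24+ can produce the shortest round-trip representation directly via the :short option (shown here as one possible direction, not necessarily what ex_aws used at the time of this issue):

```elixir
# :short (OTP 24+) yields the shortest string that round-trips to the
# same float, e.g. "1.67" rather than "1.66999999999999992895e+00".
:erlang.float_to_binary(1.67, [:short])
```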

client bang functions fail on error with no clear exception

The S3.Client list_objects! and get_object! functions do not raise a meaningful exception on error (as expected); they generate a MatchError instead.

defmodule ExAwsTest do
  use ExUnit.Case
  use ExAws.S3.Client, otp_app: Delorean.RankingsBuilder

  test "abc" do
    __MODULE__.list_objects!("foo", prefix: "bar/baz")
    assert false
  end
end

generates something like:

  1) test abc (ExAwsTest)
     test/unit/ex_aws_test.exs:5
     ** (MatchError) no match of right hand side value: {:error, {:http_error, 400, "<?xml version=\"1.0\" encoding=\"UTF-8\"?>\n<Error><Code>AuthorizationHeaderMalformed</Code><Message>The authorization header is malformed; the region 'us-west-2' is wrong; expecting 'us-east-1'</Message><Region>us-east-1</Region><RequestId>C7DE1C18F8D90136</RequestId><HostId>x5m/0zmS4VlrUAzM/igI5iWwddeUYq7vozXq/x2+bmWDW6804WzntLsZ5e8m/Mqz</HostId></Error>"}}
     stacktrace:
       (ex_aws) lib/ex_aws/s3/impl.ex:57: ExAws.S3.Impl.list_objects!/3
       test/unit/ex_aws_test.exs:6

Quoting Dave Thomas, Programming Elixir, p. 129: "The trailing exclamation point in the method name is an Elixir convention—if you see it, you know the function will raise an exception on error, and that exception will be meaningful." Sorry, it is not LRM, but I do not think Elixir has an LRM.

All DynamoDB requests time out

Hi!

First of all, thank you so much for putting so much time and dedication into this library. I truly appreciate it! I think it's awesome that you made it possible to get the credentials via the EC2 role!

I am facing some problems when running my Elixir application from within an EC2 instance inside an ECS container. All requests time out.

This is my ex_aws config:

[dynamodb: [
  scheme: {:system, "DYNAMODB_SCHEME"},
  host: {:system, "DYNAMODB_HOST"},
  port: {:system, "DYNAMODB_PORT"},
  region: "eu-west-1"
],
included_applications: [],
access_key_id: [{:system, "AWS_ACCESS_KEY_ID"}, :instance_role],
secret_access_key: [{:system, "AWS_SECRET_ACCESS_KEY"}, :instance_role]]

The last two OS env variables are undefined, so the :instance_role is used.

I have put an IO.inspect on ExAws.Request.request/5 to better understand what might be the problem:

Body of the request:

%{"AttributeDefinitions" => [%{"AttributeName" => :uid,
     "AttributeType" => "S"}],
  "KeySchema" => [%{"AttributeName" => "uid", "KeyType" => "HASH"}],
  "ProvisionedThroughput" => %{"ReadCapacityUnits" => 1,
    "WriteCapacityUnits" => 1}, "TableName" => "facebook_service_users"}

Headers:

[{"Authorization",
  "AWS4-HMAC-SHA256 Credential=/20150626/eu-west-1/dynamodb/aws4_request,SignedHeaders=content-type;host;x-amz-content-sha256;x-amz-date;x-amz-target,Signature=668....f"},
 {"host", "dynamodb.eu-west-1.amazonaws.com"},
 {"x-amz-date", "20150626T155125Z"},
 {"x-amz-target", "DynamoDB_20120810.CreateTable"},
 {"content-type", "application/x-amz-json-1.0"}, {"x-amz-content-sha256", ""}]

URL:

https://dynamodb.eu-west-1.amazonaws.com:80/

This is the error I get in the logs:

ExAws: HTTP ERROR: :connect_timeout

Just to rule out that the problem is on the IAM side, I tried to create a DynamoDB table from the EC2 instance via awscli, and that worked just fine. Therefore, the problem is at the application level.

I am running my application with MIX_ENV=test.

I would appreciate it if you would share with me any hints on what could be the problem.

Thank you a lot!

Timeout error when using along with arc

I updated arc recently and moved from erlcloud to ex_aws. Now the image upload works with local storage, but I could not upload to S3 using ex_aws; it throws a timeout error.

(screenshot: timeout error output, censored)

I tried with the latest version of ex_aws and even with the current master branch.
I previously had HTTPoison version 0.8, so I also tried 0.7.4.
I tried with a region and without it. The error still exists. What am I doing wrong? Could someone help me?

# config.exs

config :ex_aws
  region: "eu-west-1",
    access_key_id: [{:system, "MY_AWS_ACCESS_KEY_ID"}, :instance_role],
  secret_access_key: [{:system, "MY_AWS_SECRET_ACCESS_KEY"}, :instance_role]

Add erlang friendly clients

While I don't want to do tons of this, there's interest in adding erlang friendly clients so that erlang users can use this instead of erlcloud. Right now erlang users have to do 'Elixir.ExAws.S3':list_objects which is awkward.

The following gives an example s3 client.

defmodule :ex_aws_s3 do
  use ExAws.S3.Client
end

decode lists from a non-stream scan or query and/or other lists

https://github.com/CargoSense/ex_aws/blob/master/lib/ex_aws/dynamo/decoder.ex#L8

I think you mean:

items |> Stream.map(&decode(&1, as: struct_module))

Regardless, do you have any thoughts on the "correct" way to pipe results from a scan?

Also, this is a good case for scan!. I saw your notes on stream_scan, but scan could still use this. It would require that Decoder know about "Item" vs "Items".

# my vote
Dynamo.scan!(@t_name) |> Dynamo.Decoder.decode(as: Foo)

# vs

Dynamo.scan!(@t_name)["Items"] |> Dynamo.Decoder.decode(as: Foo)

# vs.
{:ok, stuff} = Dynamo.scan(@t_name)
Dynamo.Decoder...
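One way to get the scan-then-decode pipeline without teaching Decoder about "Item" vs "Items" is a small helper in application code; decode_items/2 below is hypothetical, not part of ExAws:

```elixir
# Hypothetical helper: unwrap the "Items" list from a scan/query
# response, then decode each item into the given struct.
def decode_items(%{"Items" => items}, as: struct_module) do
  Enum.map(items, &ExAws.Dynamo.Decoder.decode(&1, as: struct_module))
end

# Usage:
# Dynamo.scan!(@t_name) |> decode_items(as: Foo)
```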

Usage with OTP releases

Hi, thanks for making this library!

I'm looking into using it in our evercam-media application. Everything worked nicely in development, but I'm having problems making it run in the production environment. We use OTP releases generated with exrm, and that seems to conflict with assumptions ex_aws makes. I first thought the problem was just that mix wasn't included in the release, but adding it uncovered another issue.

This is the error I'm getting:

[error] %ArgumentError{message: "argument error"}
[error]     (stdlib) :ets.lookup(Mix.State, :env)
    (mix) lib/mix/state.ex:33: Mix.State.fetch/1
    (mix) lib/mix/state.ex:42: Mix.State.get/2
    (ex_aws) lib/ex_aws/config/defaults.ex:3: ExAws.Config.Defaults.defaults/0
    (ex_aws) lib/ex_aws/config.ex:26: ExAws.Config.get/1
    (ex_aws) lib/ex_aws/config.ex:14: ExAws.Config.build/2
    (ex_aws) lib/ex_aws/s3.ex:2: ExAws.S3.put_object!/3
    (evercam_media) lib/snapshots/snapshot_fetch.ex:69: EvercamMedia.Snapshot.store/4

You can see the code I'm using at: https://github.com/evercam/evercam-media/pull/42/files

Any suggestions? Am I doing something wrong?

Trouble with expressions

This is most likely a mistake on my part; on that note, is there an easy way to see the generated Dynamo request for debugging purposes?

Let's say I have an Item in a table called Notifications:

{:ok, n} = Dynamo.get_item("Notifications", %{ id: "1234" });                                                                                                                                    
=> {:ok,
 %{"Item" => %{"attempts" => %{"N" => "0"}, "id" => %{"S" => "1234"},
     "max_attempts" => %{"N" => "20"}, "mock" => %{"BOOL" => false},
     "request" => %{"M" => %{}}, "response" => %{"M" => %{}},
     "state" => %{"S" => "pending"}}}}

I then want to update that item, and since I'm updating an older application to use expressions, I'll also need expression attribute names for keys that collide with reserved words.

Dynamo.update_item("Notifications", %{id: n.id}, %{ return_values: "ALL_NEW", expression_attribute_names: %{ "#s" => "state" }, update_expression: "SET #s = foobar" })
=> {:error,
 {"ValidationException",
  "The provided expression refers to an attribute that does not exist in the item"}}

Again, I'm sure this is something wrong on my end. I could always stub out the Dynamo methods and see what's being passed in, but a baked in way to get the generated request body would be really helpful in situations like this.

Edit: To make things more frustrating, things seem to be working fine with good ol' AttributeUpdates.

Dynamo.update_item(table, %{id: "1234"}, %{return_values: "ALL_NEW", attribute_updates: %{state: %{Action: "PUT", Value: %{S: "queued"}}}})
=> {:ok,
 %{"Attributes" => %{"attempts" => %{"N" => "0"}, "id" => %{"S" => "1234"},
     "max_attempts" => %{"N" => "20"}, "mock" => %{"BOOL" => false},
     "request" => %{"M" => %{}}, "response" => %{"M" => %{}},
     "state" => %{"S" => "queued"}}}}
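For the record: in the failing expression above, foobar is parsed as an attribute path rather than a value, which is why DynamoDB reports a missing attribute. The usual fix is a value placeholder supplied via expression_attribute_values; a sketch based on the DynamoDB UpdateExpression API (option spellings mirror the calls above and may differ across ExAws versions):

```elixir
# ":s" is a value placeholder, "#s" an attribute-name placeholder,
# so "state" (a reserved word) can be set to "queued".
Dynamo.update_item("Notifications", %{id: "1234"}, %{
  return_values: "ALL_NEW",
  update_expression: "SET #s = :s",
  expression_attribute_names: %{"#s" => "state"},
  expression_attribute_values: %{":s" => "queued"}
})
```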

Add Travis CI

Add test configuration for Travis CI. See, e.g., CargoSense/vex#14.

Obviously, tests that depend on the network and/or a local DynamoDB instance should not run on Travis CI, so additional configuration may be necessary.

list_objects expects GetObjectAcl permission

Unless the s3:GetObjectAcl action is allowed, the Contents entries of the response XML will not contain the following element, which the parsing code seems to expect:

<Owner><ID>...</ID><DisplayName>...</DisplayName></Owner>

This leads to the following crash:

** (Protocol.UndefinedError) protocol Enumerable not implemented for nil
    (elixir) lib/enum.ex:1: Enumerable.impl_for!/1
    (elixir) lib/enum.ex:112: Enumerable.reduce/3
    (elixir) lib/stream.ex:695: Stream.do_enum_transform/8
    (elixir) lib/stream.ex:647: Stream.do_transform/7
             lib/sweet_xml.ex:484: anonymous fn/4 in SweetXml.continuation_opts/2
             xmerl_scan.erl:565: :xmerl_scan.scan_document/2
             xmerl_scan.erl:288: :xmerl_scan.string/2
             lib/sweet_xml.ex:180: SweetXml.parse/2

Credentials from ec2 role - instance metadata

It would be awesome if this client worked with EC2 IAM roles, pulling credentials from the instance metadata service. Even better would be to follow the default credential provider chain that the Amazon-maintained clients use. :)
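For reference, this is now ExAws's documented default behavior: credentials resolve from the standard environment variables first and fall back to the instance role, equivalent to:

```elixir
config :ex_aws,
  access_key_id: [{:system, "AWS_ACCESS_KEY_ID"}, :instance_role],
  secret_access_key: [{:system, "AWS_SECRET_ACCESS_KEY"}, :instance_role]
```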

Per-Service request error handling

Different services have different conventions for how errors are returned. Right now client_error/2 lives in the root request module and only handles Dynamo exceptions, which is wrong. client_error should live in the individual service request modules and handle service-specific errors there.

Bang functions

The functions in Dynamo and Kinesis should all have bang variants that return the value directly or raise an exception.
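A bang variant can be a thin wrapper over the existing function; a minimal sketch (the error message format here is illustrative):

```elixir
# Same arguments as scan/2, but returns the bare result or raises.
def scan!(table, opts \\ []) do
  case scan(table, opts) do
    {:ok, result} -> result
    {:error, reason} -> raise "ExAws scan failed: #{inspect(reason)}"
  end
end
```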

Dynamo encoding for nested structs

If a struct is nested inside another struct, Dynamo will not encode it as a map properly. While other maps return %{"M" => encoded_map} when encoded, structs return the bare encoded_map so they can be used directly in Dynamo queries.

Ideally, the current use of encode/1 for structs would be replaced by an encode_root/1 or encode(item, :root) function, and encode/1 would always return the %{"M" => encoded_map} form. How much of a breaking change this would be needs to be evaluated.
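Under that proposal the behavior would look roughly like this (encode_root/1 is the proposed, not yet existing, function; the %Foo{} struct and its encoding are illustrative):

```elixir
# Proposed behavior sketch:
Dynamo.Encoder.encode(%Foo{bar: 1})
# would return %{"M" => %{"bar" => %{"N" => "1"}}}  -- always wrapped

Dynamo.Encoder.encode_root(%Foo{bar: 1})
# would return %{"bar" => %{"N" => "1"}}            -- bare map for query roots
```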
