lukemathwalker / zero-to-production

Code for "Zero To Production In Rust", a book on API development using Rust.

Home Page: https://www.zero2prod.com

License: Apache License 2.0

Languages: Rust 94.92%, Shell 2.80%, Dockerfile 0.93%, PLpgSQL 0.39%, HTML 0.96%
Topics: rust, book

zero-to-production's Introduction

Zero To Production In Rust

Zero To Production In Rust is an opinionated introduction to backend development using Rust.

This repository serves as supplementary material for the book: it hosts several snapshots of the codebase for our email newsletter project as it evolves throughout the book.

Chapter snapshots

The main branch shows the project at the end of the book.

You can browse the project at the end of previous chapters by switching to their dedicated branches:

Pre-requisites

You'll need to install the Rust toolchain and Docker.

There are also some OS-specific requirements.

Windows

cargo install -f cargo-binutils
rustup component add llvm-tools-preview
cargo install --version="~0.7" sqlx-cli --no-default-features --features rustls,postgres

Linux

# Ubuntu 
sudo apt-get install lld clang libssl-dev postgresql-client
# Arch 
sudo pacman -S lld clang postgresql
cargo install --version="~0.7" sqlx-cli --no-default-features --features rustls,postgres

MacOS

brew install michaeleisel/zld/zld
cargo install --version="~0.7" sqlx-cli --no-default-features --features rustls,postgres

How to build

Launch a (migrated) Postgres database via Docker:

./scripts/init_db.sh

Launch a Redis instance via Docker:

./scripts/init_redis.sh

Launch cargo:

cargo build

You can now try opening http://127.0.0.1:8000/login in a browser after launching the web server with cargo run.

There is a default admin account with password everythinghastostartsomewhere. The available endpoints are listed in src/startup.rs.
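
If you want a quick sanity check that the server is up, you can hit the health check endpoint. A minimal sketch (not part of the repository; it assumes reqwest and tokio as dependencies and the default 127.0.0.1:8000 address):

// Quick smoke test against a locally running server (sketch).
#[tokio::main]
async fn main() -> Result<(), reqwest::Error> {
    let response = reqwest::get("http://127.0.0.1:8000/health_check").await?;
    // The health check endpoint returns 200 OK with an empty body.
    assert!(response.status().is_success());
    Ok(())
}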

How to test

Launch a (migrated) Postgres database via Docker:

./scripts/init_db.sh

Launch a Redis instance via Docker:

./scripts/init_redis.sh

Launch cargo:

cargo test 

zero-to-production's People

Contributors

azdle, dkulla01, filmon-arefayne, guoard, hofer-julian, jeshansen, lukemathwalker, matclab, mdtro, moises-marquez, pedromfedricci, pickfire, williamhgough


zero-to-production's Issues

fail to compile due to sqlx-rt

Compiling sqlx-rt v0.2.0
error: one of the features ['runtime-actix-native-tls', 'runtime-async-std-native-tls', 'runtime-tokio-native-tls', 'runtime-actix-rustls', 'runtime-async-std-rustls', 'runtime-tokio-rustls'] must be enabled
  --> /usr/local/cargo/registry/src/github.com-1ecc6299db9ec823/sqlx-rt-0.2.0/src/lib.rs:9:1
   |
9  | / compile_error!(
10 | |     "one of the features ['runtime-actix-native-tls', 'runtime-async-std-native-tls', \
11 | |      'runtime-tokio-native-tls', 'runtime-actix-rustls', 'runtime-async-std-rustls', \
12 | |      'runtime-tokio-rustls'] must be enabled"
13 | | );
   | |__^

error: aborting due to previous error

error: could not compile `sqlx-rt`.

Amazing work on all of this so far! I'll be looking to add the "distroless" musl setup from https://dev.to/sergeyzenchenko/actix-web-in-docker-how-to-build-small-and-secure-images-2mjd soon!

Chapter 03 pt1 test naming and result editing

A couple of minor points that will only confuse those who geek out on details. ;-)

  1. Section 2.2. Capturing Our Requirements As Tests has a test named subscribe_returns_a_200_for_valid_form_data.
    This also matches the source in tests/health_check.rs.
    But the corresponding test result following the test source shown in the book has a test named subscribe_accepts_valid_form_data. There are four instances of this name in the text that should be changed to match the code.

  2. I presume the test result shown in the book is edited to remove a lot of distracting noise. This is good, but it's also rather different from what I see when I run cargo test. A simple suggestion that might avoid a few moments' confusion: label the test result something like:

cargo test result (edited for clarity).

This is the rather noisy output I see from cargo test:

$ cargo test
    Finished test [unoptimized + debuginfo] target(s) in 0.29s
     Running target/debug/deps/zero2prod-0ab5d8eae8ffd38e

running 0 tests

test result: ok. 0 passed; 0 failed; 0 ignored; 0 measured; 0 filtered out

     Running target/debug/deps/zero2prod-c6cfe04084471d4d

running 0 tests

test result: ok. 0 passed; 0 failed; 0 ignored; 0 measured; 0 filtered out

     Running target/debug/deps/health_check-492d616831f44b24

running 3 tests
test health_check_works ... ok
test subscribe_returns_a_200_for_valid_form_data ... FAILED
test subscribe_returns_a_400_when_data_is_missing ... FAILED

failures:

---- subscribe_returns_a_200_for_valid_form_data stdout ----
thread 'subscribe_returns_a_200_for_valid_form_data' panicked at 'assertion failed: `(left == right)`
  left: `200`,
 right: `404`', tests/health_check.rs:57:5
note: run with `RUST_BACKTRACE=1` environment variable to display a backtrace
Panic in Arbiter thread.

---- subscribe_returns_a_400_when_data_is_missing stdout ----
thread 'subscribe_returns_a_400_when_data_is_missing' panicked at 'assertion failed: `(left == right)`
  left: `400`,
 right: `404`: The API did not fail with 400 Bad Request when the payload was missing the email.', tests/health_check.rs:82:9
Panic in Arbiter thread.


failures:
    subscribe_returns_a_200_for_valid_form_data
    subscribe_returns_a_400_when_data_is_missing

test result: FAILED. 1 passed; 2 failed; 0 ignored; 0 measured; 0 filtered out

error: test failed, to rerun pass '--test health_check'

PS. I'm getting a lot of value from the book, it's much appreciated. Keep up the great work! :-)

Chapter 3 part 1 - 3.7 edits

3.7. Updating Our Tests

  1. First paragraph following the first code snippet, change:

'...to perform our SELECT query, it makes to generalise a bit spawn_app: instead of returning a raw String,...'

to:

'...to perform our SELECT query, it makes sense to generalise spawn_app a bit: Instead of returning a raw String,...'

  2. Farther down in the section there's an extra word, change:

'The moment of the truth has finally come:...'

to:

'The moment of truth has finally come:...'

3.7.1. Test Isolation

  1. In the first code block, Uuid should be imported

use uuid::Uuid;

  2. In the last code block (see the sketch after this list), change:

use sqlx::Executor;

to:

use sqlx::{Connection, Executor, PgConnection, PgPool};

add:

use zero2prod::configuration::{get_configuration, DatabaseSettings};
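
Put together, those imports end up being used roughly like this (a sketch; connection_string_without_db and configure_database follow the chapter, while the spawn_test_database wrapper name here is made up for illustration):

// tests/health_check.rs (sketch)
use sqlx::{Connection, Executor, PgConnection, PgPool};
use uuid::Uuid;
use zero2prod::configuration::{get_configuration, DatabaseSettings};

// Hypothetical wrapper: read the configuration, randomise the database name
// with Uuid, then delegate to configure_database as the chapter does.
async fn spawn_test_database() -> PgPool {
    let mut configuration = get_configuration().expect("Failed to read configuration.");
    configuration.database.database_name = Uuid::new_v4().to_string();
    configure_database(&configuration.database).await
}

async fn configure_database(config: &DatabaseSettings) -> PgPool {
    // Connect to the Postgres instance (no specific database) and create a fresh one.
    let mut connection = PgConnection::connect(&config.connection_string_without_db())
        .await
        .expect("Failed to connect to Postgres");
    connection
        .execute(format!(r#"CREATE DATABASE "{}";"#, config.database_name).as_str())
        .await
        .expect("Failed to create database.");

    // Run the migrations and hand back a pool scoped to the new database.
    let connection_pool = PgPool::connect(&config.connection_string())
        .await
        .expect("Failed to connect to Postgres");
    sqlx::migrate!("./migrations")
        .run(&connection_pool)
        .await
        .expect("Failed to migrate the database");
    connection_pool
}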


I really enjoyed Chapter 3 - Got a lot out of it!

Unfortunately, I need to do some 'paying work' before I dive into chapter 4, so I may be absent for a while. I'm really looking forward to continuing the journey.

Configuration secrets and undo migrations

@LukeMathWalker, thanks for writing this great book!

I’ve got two questions:

  • What are your thoughts on undo migrations? Is it something that you plan to add? How are migration workflows usually dealt with in cloud-native environments, especially rollbacks? For example, in Kubernetes, I’ve used Helm hooks to run migration jobs when deploying, but I haven’t found a good solution for rollbacks.
  • What’s the plan for configuration secrets like a database password? Will they just be stored in the repo encrypted, or perhaps there will be an overrides file not checked into version control? How will this interact with the fact that there are multiple sources of truth for config values (i.e., .env and configuration.yaml)? Are there plans to get rid of these multiple sources of truth? It does feel a bit dirty/problematic.

Thanks!

RUSTSEC-2018-0019: Multiple memory safety issues

Multiple memory safety issues

Details
Package actix-web
Version 4.0.0-beta.3
URL actix/actix-web#289
Date 2018-06-08
Patched versions >=0.7.15

Affected versions contain multiple memory safety issues, such as:

  • Unsoundly coercing immutable references to mutable references
  • Unsoundly extending lifetimes of strings
  • Adding the Send marker trait to objects that cannot be safely sent between threads

This may result in a variety of memory corruption scenarios, most likely use-after-free.

A significant refactoring effort has been conducted to resolve these issues.

See advisory page for additional details.

Chapter 3.5: For `spawn_app()` an `.await;` is required

It's been a long day so I may have got some part wrong... but I am finding that spawn_app() requires .await; on page 32.

Without it:

$ cargo test --verbose --test healthy_check

running 1 test
test health_check_works ... FAILED

failures:

---- health_check_works stdout ----
thread 'health_check_works' panicked at 'Failed to execute request.: reqwest::Error { kind: Request, url: Url { scheme: "http", username: "", password: None, host: Some(Ipv4(127.0.0.1)), port: Some(8000), path: "/health_check", query: None, fragment: None }, source: hyper::Error(Connect, ConnectError("tcp connect error", Os { code: 111, kind: ConnectionRefused, message: "Connection refused" })) }', tests/healthy_check.rs:18:10
note: run with `RUST_BACKTRACE=1` environment variable to display a backtrace


failures:
    health_check_works

test result: FAILED. 0 passed; 1 failed; 0 ignored; 0 measured; 0 filtered out; finished in 0.02s

error: test failed, to rerun pass '--test healthy_check'

The spawn_app is:

async fn spawn_app() {
    let server = zero2prod::run().expect("Failed to bind address");
    let _ = tokio::spawn(server);
}
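
With spawn_app written as an async fn (as above), the call site has to await it, otherwise the returned future is dropped unpolled and the server never starts. A minimal sketch of the call site (file and test names follow my setup above; if spawn_app is instead kept synchronous, as I believe the book's version at this point is, no .await is needed):

// tests/healthy_check.rs (sketch)
#[actix_rt::test]
async fn health_check_works() {
    // Forgetting `.await` here means spawn_app's future is never polled,
    // so nothing binds to port 8000 and reqwest reports "Connection refused".
    spawn_app().await;

    let client = reqwest::Client::new();
    let response = client
        .get("http://127.0.0.1:8000/health_check")
        .send()
        .await
        .expect("Failed to execute request.");

    assert!(response.status().is_success());
    assert_eq!(Some(0), response.content_length());
}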

Chapter 3.3.1: Graceful termination of actix webserver (blocking cargo run)

After curl ... returns "Hello World!", it would be nice to know how to properly clean up the web server we have running.
Ctrl+C seemed to work and I couldn't see any orphaned processes, but then again I didn't check what they might be called - my bad. But if there is a blessed shutdown mechanism, this is a good place to introduce it.

RUSTSEC-2020-0048: Use-after-free in BodyStream due to lack of pinning

Use-after-free in BodyStream due to lack of pinning

Details
Package actix-http
Version 1.0.1
URL actix/actix-web#1321
Date 2020-01-24
Patched versions >= 2.0.0-alpha.1

Affected versions of this crate did not require the buffer wrapped in BodyStream to be pinned,
but treated it as if it had a fixed location in memory. This may result in a use-after-free.

The flaw was corrected by making the trait MessageBody require Unpin
and making poll_next() function accept Pin<&mut Self> instead of &mut self.

See advisory page for additional details.

Errors while compiling chapter 4

Reproduce:

rustup install nightly
rustup default nightly
rustc --version
rustc 1.49.0-nightly (043eca7f0 2020-10-17)

git clone https://github.com/LukeMathWalker/zero-to-production
cd chapter04
cargo build

   Compiling proc-macro2 v1.0.24
   Compiling unicode-xid v0.2.1
   Compiling syn v1.0.45
   Compiling libc v0.2.79
   Compiling cfg-if v0.1.10
   Compiling autocfg v1.0.1
   Compiling lazy_static v1.4.0
   Compiling memchr v2.3.3
   Compiling log v0.4.11
   Compiling version_check v0.9.2
   Compiling slab v0.4.2
   Compiling futures-core v0.3.6
   Compiling proc-macro-hack v0.5.18
   Compiling pin-project-internal v0.4.27
   Compiling pin-project-lite v0.1.10
   Compiling futures-sink v0.3.6
   Compiling bytes v0.5.6
   Compiling fnv v1.0.7
   Compiling once_cell v1.4.1
   Compiling proc-macro-nested v0.1.6
   Compiling bitflags v1.2.1
   Compiling arc-swap v0.4.7
   Compiling pin-utils v0.1.0
   Compiling cc v1.0.61
   Compiling futures-io v0.3.6
   Compiling smallvec v1.4.2
   Compiling typenum v1.12.0
   Compiling getrandom v0.1.15
   Compiling scopeguard v1.1.0
   Compiling itoa v0.4.6
   Compiling tinyvec v0.3.4
   Compiling matches v0.1.8
   Compiling serde_derive v1.0.117
   Compiling regex-syntax v0.6.20
   Compiling byteorder v1.3.4
   Compiling serde v1.0.117
   Compiling percent-encoding v2.1.0
   Compiling ppv-lite86 v0.2.9
   Compiling linked-hash-map v0.5.3
   Compiling ryu v1.0.5
   Compiling either v1.6.1
   Compiling pkg-config v0.3.19
   Compiling unicode-segmentation v1.6.0
   Compiling adler v0.2.3
   Compiling opaque-debug v0.3.0
   Compiling copyless v0.1.5
   Compiling gimli v0.22.0
   Compiling cpuid-bool v0.1.2
   Compiling const_fn v0.4.2
   Compiling cfg-if v1.0.0
   Compiling object v0.21.1
   Compiling openssl v0.10.30
   Compiling rustc-demangle v0.1.17
   Compiling foreign-types-shared v0.1.1
   Compiling serde_json v1.0.59
   Compiling match_cfg v0.1.0
   Compiling maybe-uninit v2.0.0
   Compiling quick-error v1.2.3
   Compiling base64 v0.12.3
   Compiling crc32fast v1.2.0
   Compiling native-tls v0.2.4
   Compiling httparse v1.3.4
   Compiling hashbrown v0.9.1
   Compiling encoding_rs v0.8.24
   Compiling build_const v0.2.1
   Compiling openssl-probe v0.1.2
   Compiling subtle v2.3.0
   Compiling dtoa v0.4.6
   Compiling mime v0.3.16
   Compiling lexical-core v0.7.4
   Compiling language-tags v0.2.2
   Compiling ahash v0.3.8
   Compiling maplit v1.0.2
   Compiling hex v0.4.2
   Compiling static_assertions v1.1.0
   Compiling arrayvec v0.5.1
   Compiling whoami v0.9.0
   Compiling tinyvec_macros v0.1.0
   Compiling dotenv v0.15.0
   Compiling ansi_term v0.12.1
   Compiling instant v0.1.7
   Compiling thread_local v1.0.1
   Compiling tracing-core v0.1.17
   Compiling sharded-slab v0.0.9
   Compiling miniz_oxide v0.4.3
   Compiling num-traits v0.2.12
   Compiling num-integer v0.1.43
   Compiling crossbeam-utils v0.7.2
   Compiling indexmap v1.6.0
   Compiling hashbrown v0.8.2
   Compiling futures-channel v0.3.6
   Compiling generic-array v0.14.4
   Compiling standback v0.2.11
   Compiling time v0.2.22
   Compiling cookie v0.14.2
   Compiling nom v5.1.2
   Compiling futures-task v0.3.6
   Compiling bytestring v0.1.5
   Compiling lock_api v0.4.1
   Compiling http v0.2.1
   Compiling unicode-bidi v0.3.4
   Compiling unicode-normalization v0.1.13
   Compiling lru-cache v0.1.2
   Compiling yaml-rust v0.4.4
   Compiling brotli-sys v0.3.2
   Compiling heck v0.3.1
   Compiling openssl-sys v0.9.58
   Compiling foreign-types v0.3.2
   Compiling addr2line v0.13.0
   Compiling crc v1.8.1
   Compiling tinyvec v1.0.1
   Compiling idna v0.2.0
   Compiling stringprep v0.1.2
   Compiling quote v1.0.7
   Compiling iovec v0.1.4
   Compiling net2 v0.2.35
   Compiling num_cpus v1.13.0
   Compiling signal-hook-registry v1.2.1
   Compiling parking_lot_core v0.8.0
   Compiling hostname v0.3.1
   Compiling time v0.1.44
   Compiling socket2 v0.3.15
   Compiling gethostname v0.2.1
   Compiling tracing-log v0.1.1
   Compiling aho-corasick v0.7.14
   Compiling fxhash v0.2.1
   Compiling regex-automata v0.1.9
   Compiling url v2.1.1
   Compiling mio v0.6.22
   Compiling threadpool v1.8.1
   Compiling parking_lot v0.11.0
   Compiling rand_core v0.5.1
   Compiling resolv-conf v0.6.3
   Compiling backtrace v0.3.53
   Compiling flate2 v1.0.18
   Compiling atoi v0.3.2
   Compiling crossbeam-queue v0.2.3
   Compiling crossbeam-channel v0.4.4
   Compiling regex v1.4.1
   Compiling matchers v0.0.1
   Compiling digest v0.9.0
   Compiling block-buffer v0.9.0
   Compiling crypto-mac v0.8.0
   Compiling mio-uds v0.6.8
   Compiling rand_chacha v0.2.2
   Compiling brotli2 v0.3.2
   Compiling chrono v0.4.19
   Compiling sqlformat v0.1.1
   Compiling sha-1 v0.9.1
   Compiling sha2 v0.9.1
   Compiling md-5 v0.9.1
   Compiling hmac v0.8.1
   Compiling rand v0.7.3
   Compiling uuid v0.8.1
   Compiling tokio-macros v0.2.5
   Compiling futures-macro v0.3.6
   Compiling derive_more v0.99.11
   Compiling thiserror-impl v1.0.21
   Compiling actix-macros v0.1.2
   Compiling tracing-attributes v0.1.11
   Compiling enum-as-inner v0.3.3
   Compiling time-macros-impl v0.1.1
   Compiling async-trait v0.1.41
   Compiling actix-web-codegen v0.3.0
   Compiling tokio v0.2.22
   Compiling pin-project v0.4.27
   Compiling thiserror v1.0.21
   Compiling tracing v0.1.21
   Compiling time-macros v0.1.1
   Compiling actix-threadpool v0.3.3
   Compiling futures-util v0.3.6
   Compiling tracing-futures v0.2.4
   Compiling tokio-util v0.3.1
   Compiling tokio-native-tls v0.1.0
   Compiling serde_urlencoded v0.6.1
   Compiling actix-router v0.2.5
   Compiling tracing-serde v0.1.2
   Compiling config v0.10.1
   Compiling sqlx-rt v0.1.1
   Compiling actix-codec v0.3.0
   Compiling futures-executor v0.3.6
   Compiling actix-service v1.0.6
   Compiling actix-rt v1.1.1
   Compiling h2 v0.2.6
   Compiling tracing-subscriber v0.2.13
   Compiling sqlx-core v0.4.0-beta.1
   Compiling futures v0.3.6
   Compiling actix-utils v2.0.0
   Compiling trust-dns-proto v0.19.5
   Compiling tracing-bunyan-formatter v0.1.6
   Compiling actix-server v1.0.4
   Compiling actix-tls v2.0.0
   Compiling trust-dns-resolver v0.19.5
   Compiling actix-testing v1.0.1
   Compiling actix-connect v2.0.0
   Compiling sqlx-macros v0.4.0-beta.1
   Compiling actix-http v2.0.0
   Compiling awc v2.0.0
   Compiling actix-web v3.1.0
   Compiling sqlx v0.4.0-beta.1
   Compiling tracing-actix-web v0.2.1
   Compiling chapter04 v0.1.0 (/home/oren/p/rust/zero-to-production/chapter04)
error: error communicating with the server: Connection refused (os error 111)
   --> /home/oren/.cargo/registry/src/github.com-1ecc6299db9ec823/sqlx-0.4.0-beta.1/src/macros.rs:292:9
    |
278 | / macro_rules! query (
279 | |     // in Rust 1.45 we can now invoke proc macros in expression position
280 | |     ($query:expr) => ({
281 | |         $crate::sqlx_macros::expand_query!(source = $query)
...   |
292 | |         $crate::sqlx_macros::expand_query!(source = $query, args = [$($args)*])
    | |         ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ in this macro invocation (#2)
293 | |     })
294 | | );
    | |__- in this expansion of `sqlx::query!` (#1)
    | 
   ::: /home/oren/.cargo/registry/src/github.com-1ecc6299db9ec823/sqlx-macros-0.4.0-beta.1/src/lib.rs:24:1
    |
24  |   pub fn expand_query(input: TokenStream) -> TokenStream {
    |   ------------------------------------------------------ in this expansion of `$crate::sqlx_macros::expand_query!` (#2)
    | 
   ::: chapter04/src/routes/subscriptions.rs:35:5
    |
35  | /     sqlx::query!(
36  | |         r#"
37  | |     INSERT INTO subscriptions (id, email, name, subscribed_at)
38  | |     VALUES ($1, $2, $3, $4)
...   |
43  | |         Utc::now()
44  | |     )
    | |_____- in this macro invocation (#1)

warning: unused import: `chrono::Utc`
 --> chapter04/src/routes/subscriptions.rs:2:5
  |
2 | use chrono::Utc;
  |     ^^^^^^^^^^^
  |
  = note: `#[warn(unused_imports)]` on by default

warning: unused import: `uuid::Uuid`
 --> chapter04/src/routes/subscriptions.rs:4:5
  |
4 | use uuid::Uuid;
  |     ^^^^^^^^^^

error: aborting due to previous error; 2 warnings emitted

error: could not compile `chapter04`

To learn more, run the command again with --verbose.
warning: build failed, waiting for other jobs to finish...
error: build failed
cargo build --verbose

Output is here: http://paste.ubuntu.com/p/JvrRQh9c5d/

Chapter 3 part 1 minor omission

In chapter 03 part 1 section 3.5.3 it would be helpful to have a notation in the code indicating the filename where the code is to be added:

//! src/configuration.rs <-- add this
...
impl DatabaseSettings {
...

PS. I'm really getting a lot out of Zero2Prod. This configuration feature has been on my list of things to put into my projects for some time, and here you just hand it to me complete. Lovin' it! 💯

Chapter 3.5, Page 31 has a typo

use zero2prod::run;

#[actix_rt::main]
async fn main() -> std::io::Result<()> {
    // Bubble up the io::Error if we failed to bind the address
    // Otherwise call .await on our Server
    run()?.await
}

The above code snippet should be run().await instead of run()?.await.

Sqlx 1 to many example

Hi @LukeMathWalker. I'm following along with your tutorial with my own use case, but I can't seem to find a straightforward answer about sqlx support for one-to-many deserialization.

I have a one-to-many relation between two tables:

CREATE TABLE consumable (
    id UUID NOT NULL DEFAULT uuid_generate_v1(),
    name VARCHAR(255),
    PRIMARY KEY (id)
);

CREATE TABLE serving (
    id SERIAL PRIMARY KEY,
    consumable_id UUID NOT NULL,
    amount DECIMAL(12,4) NOT NULL,
    kcal DECIMAL(12,4) NOT NULL,
    CONSTRAINT fk_consumable
        FOREIGN KEY(consumable_id)
        REFERENCES consumable(id)
        ON DELETE CASCADE
);

And these are the corresponding structs

#[derive(serde::Serialize, serde::Deserialize, sqlx::FromRow)]
#[sqlx(rename = "consumable")]
pub struct Consumable {
    pub id: Uuid,
    pub name: String,
    pub servings: Vec<Serving>
}

#[derive(serde::Serialize, serde::Deserialize, sqlx::FromRow)]
#[sqlx(rename = "serving")]
pub struct Serving {
    pub id: String,
    pub amount: BigDecimal,
    pub kcal: BigDecimal
}

Do you know of a way to write a query that will deserialize directly into an instance of Consumable? Is the code below something that is possible with sqlx, or would I have to query the tables separately and then compose the structs programmatically?

pub async fn consumable_by_id(path: web::Path<String>, pool: web::Data<PgPool>) -> impl Responder {
    let id = path.0;
    let consumable_or_error = sqlx::query_as!(
        Consumable,
        r#"
        <SOME_QUERY_NOT_SURE_WHAT>
        WHERE consumable.id = $1
        "#,
        id
    )
    .fetch_one(pool.get_ref())
    .await;

    match consumable_or_error {
        Ok(consumable) => HttpResponse::Ok()
            .content_type("application/json")
            .json(consumable),
        Err(_) => HttpResponse::InternalServerError().finish(),
    }
}

Issue in 5.11 with tracing-bunyan-formatter-0.1.7

Hi Luke,

I'm hitting an issue with tracing-bunyan-formatter-0.1.7

{"v":0,"name":"z2p","msg":"Starting 2 workers","level":30,"hostname":"li98-183","pid":3669213,"time":"2021-02-02T16:59:30.991252242+00:00","log.line":263,"log.target":"actix_server::builder","log.module_path":"actix_server::builder","log.file":"/home/user/.cargo/registry/src/github.com-1ecc6299db9ec823/actix-server-1.0.4/src/builder.rs"}

{"v":0,"name":"z2p","msg":"Starting \"actix-web-service-0.0.0.0:8000\" service on 0.0.0.0:8000","level":30,"hostname":"li98-183","pid":3669213,"time":"2021-02-02T16:59:30.991687848+00:00","log.module_path":"actix_server::builder","log.line":277,"log.file":"/home/user/.cargo/registry/src/github.com-1ecc6299db9ec823/actix-server-1.0.4/src/builder.rs","log.target":"actix_server::builder"}
{"v":0,"name":"z2p","msg":"[ADDING A NEW SUBSCRIBER. - START]","level":30,"hostname":"li98-183","pid":3669213,"time":"2021-02-02T16:59:39.087001836+00:00","target":"z2p::routes::subscriptions","line":21,"file":"src/routes/subscriptions.rs","email":"[email protected]","name":"Joey Dean4","request_id":"ba08c180-7945-4f7c-802c-178dd00f2760"}

{"v":0,"name":"z2p","msg":"[SAVING NEW SUBSCRIBER DETAILS IN THE DATABASE - START]","level":30,"hostname":"li98-183","pid":3669213,"time":"2021-02-02T16:59:39.087462768+00:00","target":"z2p::routes::subscriptions","line":35,"file":"src/routes/subscriptions.rs","email":"[email protected]","name":"Joey Dean4","request_id":"ba08c180-7945-4f7c-802c-178dd00f2760"}

{"v":0,"name":"z2p","msg":"[ADDING A NEW SUBSCRIBER. - EVENT] /* SQLx ping */; rows: 0, elapsed: 587.002µs","level":30,"hostname":"li98-183","pid":3669213,"time":"2021-02-02T16:59:39.088641273+00:00","log.target":"sqlx::query","log.module_path":"sqlx::query","email":"[email protected]","name":"Joey Dean4","request_id":"ba08c180-7945-4f7c-802c-178dd00f2760"}

{"v":0,"name":"z2p","msg":"[ADDING A NEW SUBSCRIBER. - EVENT] INSERT INTO subscriptions (id, …; rows: 0, elapsed: 2.325ms\n\nINSERT INTO\n  subscriptions (id, email, name, subscribed_at)\nVALUES\n($1, $2, $3, $4)\n","level":30,"hostname":"li98-183","pid":3669213,"time":"2021-02-02T16:59:39.097324124+00:00","log.target":"sqlx::query","log.module_path":"sqlx::query","email":"[email protected]","name":"Joey Dean4","request_id":"ba08c180-7945-4f7c-802c-178dd00f2760"}

{"v":0,"name":"z2p","msg":"[ADDING A NEW SUBSCRIBER. - EVENT] request_id ba08c180-7945-4f7c-802c-178dd00f2760 - New subscriber details have been saved","level":30,"hostname":"li98-183","pid":3669213,"time":"2021-02-02T16:59:39.097601018+00:00","email":"[email protected]","name":"Joey Dean4","request_id":"ba08c180-7945-4f7c-802c-178dd00f2760"}

thread 'actix-rt:worker:0' panicked at 'Timestamp not found on 'record', this is a bug', /home/user/.cargo/registry/src/github.com-1ecc6299db9ec823/tracing-bunyan-formatter-0.1.7/src/storage_layer.rs:157:18
note: run with `RUST_BACKTRACE=1` environment variable to display a backtrace

The above happens when I curl the subscribe endpoint on the running server, or when I run cargo test. Prior to migrating to the tracing library, the tests ran fine with the log library.

Please see my attached repository https://github.com/cmdrtomalak/zero2production

Thanks.

Chapter 3 part 1 - 3.6 typos and edits

The very last sentence in 3.6. Persisting A New Subscriber contains extra words:

"Let's see figure out how to get one."

Change to:

"Let's see how to get one."

Ch 5.3.8.2 Dockerfile rust image correction

In the third build stage of the Dockerfile, builder, the Rust image is pinned to version 1.49. This causes the builder stage to re-compile all dependencies, since the lukemathwalker/cargo-chef image uses latest, which is now at version 1.50.

6.12.2 - Missing SubscriberEmail Implementations in order to pass tests at end of section

If following the section 6.12.2 verbatim, you'll end up with a partial implementation that is missing a few things. The code will compile, but we won't pass the tests as the email address is still just a String in the subscribe([...]) function.

Here's what I updated to make sure the tests passed.

  • (./src/domain/new_subscriber.rs): Updating email in the NewSubscriber from String to SubscriberEmail (and the necessary import) - see the sketch after this list

  • (./src/routes/subscription.rs): Updating subscribe([...])...

    • Need to add a line to parse form.0.email into SubscriberEmail
       let email = SubscriberEmail::parse(form.0.email)
       	.map_err(|_| HttpResponse::BadRequest().finish())?;
    • Update new_subscriber variable to use the new SubscriberEmail value
       let new_subscriber = NewSubscriber { email, name };
    • Update insert_subscriber([...]) function's sqlx::query to use the new SubscriberEmail value with .as_ref()
       sqlx::query!(
           r#"
           INSERT INTO subscriptions (id, email, name, subscribed_at)
           VALUES ($1, $2, $3, $4)
           "#,
           Uuid::new_v4(),
           new_subscriber.email.as_ref(),
           new_subscriber.name.as_ref(),
           Utc::now()
       )
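
For the first bullet, the domain struct ends up looking something like this (a sketch; it assumes SubscriberEmail and SubscriberName are both re-exported from the domain module, as in the book):

// src/domain/new_subscriber.rs (sketch)
use crate::domain::{SubscriberEmail, SubscriberName};

pub struct NewSubscriber {
    // Previously a plain String; now the parsed, validated type.
    pub email: SubscriberEmail,
    pub name: SubscriberName,
}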

I really loved the sections on AsRef and error-handling! I'm enjoying working through this book. This will definitely be one of my go-to's for recommended Rust reading.

Chapter 3 part 1 missing import

In chapter 03 part 1 section 3.5.3 the test is expected to pass at the end of the section, but it fails due to a missing import.

This import needs to be added to the last code block in the text for that section:

use zero2prod::configuration::get_configuration;

Then the test passes as expected. 😄
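
For context, this is roughly where the import gets used in the test at the end of 3.5.3 (a sketch reconstructed from the chapter; spawn_app is the chapter's helper that returns the application address):

// tests/health_check.rs (sketch)
use sqlx::{Connection, PgConnection};
use zero2prod::configuration::get_configuration;

#[actix_rt::test]
async fn subscribe_returns_a_200_for_valid_form_data() {
    let app_address = spawn_app();
    // This is the line that fails to compile without the import above.
    let configuration = get_configuration().expect("Failed to read configuration");
    let connection_string = configuration.database.connection_string();
    let mut connection = PgConnection::connect(&connection_string)
        .await
        .expect("Failed to connect to Postgres.");
    let client = reqwest::Client::new();

    let body = "name=le%20guin&email=ursula_le_guin%40gmail.com";
    let response = client
        .post(&format!("{}/subscriptions", &app_address))
        .header("Content-Type", "application/x-www-form-urlencoded")
        .body(body)
        .send()
        .await
        .expect("Failed to execute request.");

    assert_eq!(200, response.status().as_u16());

    let saved = sqlx::query!("SELECT email, name FROM subscriptions",)
        .fetch_one(&mut connection)
        .await
        .expect("Failed to fetch saved subscription.");

    assert_eq!(saved.email, "ursula_le_guin@gmail.com");
    assert_eq!(saved.name, "le guin");
}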

Chapter 4.3 - typo in curl https://<...>

In section 4.3, there is a small typo in the curl command:

[screenshot of the book's curl command in section 4.3, using https://]

Using https:// will result in

❯ curl https://127.0.0.1:8000/health_check
curl: (35) error:1408F10B:SSL routines:ssl3_get_record:wrong version number

Dropping the s in https:// will sort it!

❯ curl http://127.0.0.1:8000/health_check
~

Loving the book!

Chapter 4 - minor improvement suggestions


Following the tutorial, in part 3.3 I get a deprecation warning:

warning: use of deprecated function `env_logger::from_env`: Prefer `env_logger::Builder::from_env()` instead.

Also, the Ok(()) is confusing, as the last line of my main is

run(listener, connection_pool)?.await

based on the previous chapter.

In the part where you add tracing-futures, the next code block should perhaps also show

use tracing_futures::Instrument;

as it is required and is part of the diff to the previous version.
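
To illustrate why the import matters, here is a minimal, self-contained sketch (the helper names are made up; only the .instrument call mirrors the chapter):

// `.instrument` is an extension-trait method, so it only resolves once the
// trait is brought into scope by the `use` line below.
use tracing_futures::Instrument;

async fn save_subscriber_details() {
    // stand-in for the chapter's sqlx insert
}

async fn subscribe() {
    let query_span = tracing::info_span!("Saving new subscriber details in the database");
    save_subscriber_details().instrument(query_span).await;
}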

Some food for thought: when I found this series for the first time (through arewewebyet) and read the introduction, I figured that this series was actually not for me. I have never built a backend system, nor do I plan to build a distributed system anytime soon, so I backed away. After a couple of days I gave it a second shot and figured out that this is actually an amazing learning experience about how to build a well organized and reliable web app, even for somebody who has never built a web app before. So maybe think about changing the wording a little bit. You might be losing some parts of the audience with the "who is this aimed at" part.

Chapter 3.5 a few more issues

I'm giving this chapter a second reading (while applying the material to a production project) ;-)

Here are a few minor (some debatable) issues I noticed:

  1. 3.1. Choosing A Database

    it is much easier to design yourself in a corner <-- design yourself into a corner

  2. 3.4.2.3. Adding A Migration

    our mailing list would have to incredibly popular <-- have to be incredibly popular

  3. 3.4.2.4. Running Migrations

    if Postgres is not spinned up by our script. <-- not spun up by our script

  4. 3.5.1. Sqlx Features Flags <--- 3.5.1. Sqlx Feature Flags

    Crates.io actually lists "Cargo Feature Flags", but that feels wrong. I would stick with Sqlx Feature Flags.

  5. 3.5.2.2. Reading A Configuration File

    Do you think it would be off-topic to mention here that production
    passwords should not be included in files committed to the repository?

  6. 3.6.1. Application State In Actix-web

    The first code listing is missing the comment //! src/startup.rs

  7. 3.6.4. The INSERT query

    we designed ourselves in a corner <--- we designed ourselves into a corner

  8. 3.7. Updating Our Tests

    perform our SELECT query, it makes to generalise spawn_app a bit <--- it makes sense to generalise

Chapter 1.4.1: cargo-audit install fix functionality

A great thing about 02P is the introduction to the Rust tool ecosystem.

cargo-audit and cargo-deny are fantastic. Recognizing that cargo-deny is a footnote, I'll limit myself to cargo-audit.

Changing the install instruction to this:

cargo install cargo-audit --features=fix

Means that when cargo-audit raises a security advisory:

Crate:         actix-web                                                                
Version:       4.0.0-beta.3                                                             
Title:         Multiple memory safety issues                              
Date:          2018-06-08                                                              
ID:            RUSTSEC-2018-0019                                                        
URL:           https://rustsec.org/advisories/RUSTSEC-2018-0019                         
Solution:      Upgrade to >=0.7.15                                                      
Dependency tree:                                                                        
actix-web 4.0.0-beta.3                                                                  
└── zero2prod 0.1.0

It can be fixed with:

cargo audit fix

And here is the gotcha: cargo audit fix changes actix-web = "=4.0.0-beta.3" to actix-web = ">=0.7.15" in the Cargo.toml, and still leaves one issue unresolved.

Unfortunately this means we need to "soft-off" the pre-push git hook guard-rail - for the moment. We say soft-off because we can prevent the error from halting our git-hook workflow, but still produce output, using:

cargo audit || true

This also illustrates why it is better to have these guards in git hooks rather than in the CI/CD service. By placing the guard in a git hook, the undesirable change never makes it into the code base. Placing these guards at the CI/CD service level would mean we only discover the issue after the code changes have been committed and pushed.

Chapter 3 Cargo test issue

I am not sure if the issue is on my end or if I am missing something.
When I run cargo test it errors on tokio as below:

error[E0433]: failed to resolve: use of undeclared type or module `tokio`
 --> tests/health_check.rs:9:13
  |
9 |     let _ = tokio::spawn(server);
  |             ^^^^^ use of undeclared type or module `tokio`

error: aborting due to previous error

For more information about this error, try `rustc --explain E0433`.
error: could not compile `bombcampaign-rust`.

Binary name inconsistency

In chapter 3 we defined the app name as app
[screenshot from chapter 3 showing the app name defined as app]
However, the binary in chapter 5 is referred to as zero2prod. The guide itself therefore does not build.
[screenshots from chapter 5 referring to the zero2prod binary]

Keep up the great work!

Wrong imports in code-block chapter 3.5 part 3.6


The Arc import is not used in the code.

Also, the comment Wrap the pool in an Arc smart pointer is slightly misleading, as we are wrapping it in web::Data.

A sidenote: I spent quite some time trying to figure out what was wrong after forgetting to update Arc to web::Data in the run function. It actually compiled, but the subscribe tests returned status 500 without any more information. It might be worth mentioning this more explicitly in the text, as the related small "exercise" is not so interesting -- neither the compiler nor the errors point you in the right direction. I started comparing my code to yours 1:1 and that is the only way I could find the inconsistency.
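
For anyone hitting the same silent 500s, the shape of the fix in run looks roughly like this (a sketch; the route handlers and signatures follow chapter 3, but double-check against the chapter's own listing):

// src/startup.rs (sketch)
use crate::routes::{health_check, subscribe};
use actix_web::dev::Server;
use actix_web::{web, App, HttpServer};
use sqlx::PgPool;
use std::net::TcpListener;

pub fn run(listener: TcpListener, db_pool: PgPool) -> Result<Server, std::io::Error> {
    // web::Data wraps the pool in an Arc internally; registering the web::Data
    // value (rather than a bare Arc) is what the web::Data<PgPool> extractor
    // looks up at request time - hence the 500s when the two don't match.
    let db_pool = web::Data::new(db_pool);
    let server = HttpServer::new(move || {
        App::new()
            .route("/health_check", web::get().to(health_check))
            .route("/subscriptions", web::post().to(subscribe))
            .app_data(db_pool.clone())
    })
    .listen(listener)?
    .run();
    Ok(server)
}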

Chapter 7.3: mention limits for open file descriptors on Linux and how to change them (ulimit)

When combining all integration tests into one binary as done in Chapter 7.3, one can relatively quickly run into errors akin to this:

{"v":0,"name":"test","msg":"Can not initialize stream handler for Term err: Too many open files (os error 24)",
"level":50,"hostname":"blackie","pid":39982,"time":"2021-02-19T12:43:53.978735862+00:00",
"log.target":"actix_server::signals","log.line":58,"log.module_path":"actix_server::signals",
"log.file":"/home/janis/.cargo/registry/src/github.com-1ecc6299db9ec823/actix-server-1.0.4/src/signals.rs"}

thread 'actix-rt:worker:288' panicked at 'Can not create Runtime: Os { code: 24, kind: Other,
message: "Too many open files" }',
/home/janis/.cargo/registry/src/github.com-1ecc6299db9ec823/actix-rt-1.1.1/src/arbiter.rs:115:45

thread 'login::login_returns_401_with_missing_username_or_password' panicked at
'Failed to execute request.: reqwest::Error { kind: Request, url: Url { scheme: "http",
host: Some(Ipv4(127.0.0.1)), port: Some(36349), path: "/user/register", query: None, fragment: None },
source: hyper::Error(Connect, ConnectError("tcp connect error", Os { code: 111, kind: ConnectionRefused,
message: "Connection refused" })) }', tests/api/login.rs:31:20

The reason this happens is that instead of having several binaries, each running as its own process, we now have one process that opens all of the file descriptors (which include sockets). On Linux, the limit for the maximum number of file descriptors that one process is allowed to have open at the same time is typically 1024 (your distribution can deviate from this number and set its own limit):

❯ ulimit -a
Maximum size of core files created                           (kB, -c) unlimited
Maximum size of a process’s data segment                     (kB, -d) unlimited
Maximum size of files created by the shell                   (kB, -f) unlimited
Maximum size that may be locked into memory                  (kB, -l) 64
Maximum resident set size                                    (kB, -m) unlimited
Maximum number of open file descriptors                          (-n) 1024
Maximum stack size                                           (kB, -s) 8192
Maximum amount of cpu time in seconds                   (seconds, -t) unlimited
Maximum number of processes available to a single user           (-u) 240647
Maximum amount of virtual memory available to the shell      (kB, -v) unlimited

You can change the limit with ulimit -n <number>. For example, ulimit -n 8192 allows my tests to pass.

Chapter 2: Executable stories - or when is something the subject of unit and/or integration tests

Chapter 2 began with some familiar user story syntax. I recalled Rust had a Gherkin interpreter, in the Cucumber-Rust crate, with an active github project.

Not sure if this detail is out of scope, but it brings up an issue that could be discussed here: Integration tests vs unit tests.
I struggled for years to come up with a workable approach that also let the test suites finish within a reasonable time-frame - this was the critical gotcha around unit and integration tests.

My rules of thumb were (albeit for large unit and integration test suites - but they all eventually become large ;)):

  1. if code is called in a step definition, it is an integration test subject. That fn may still have unit tests, but those unit tests don't need to test functionality the integration test exercises. A bit rusty here... but that guideline helped a lot in working out what delimits the unit and integration test suites.
  2. A git pre-commit hook runs the changed subset of unit tests, i.e. the test files related to the files changed by a commit are run prior to the commit.
  3. All integration tests (running all feature/story files) are run in the CI/CD environment on push/PR.
  4. If unit tests were green (required for a commit to be made), and a feature/integration test broke -> there is a critical gap in the unit test suite (Until then I found it very difficult to work out/decide when a unit test suite was incomplete in an important way).
  5. All unit tests and all integration tests are run on branch merge or some such significant event.

It would be very interesting to have your thoughts/experiences on/with these issues.

Even if you decide executable scenarios/stories are off-topic/out-of-scope, some users may be working in organizations where they are required, so it may help them to know of cucumber-rust?

Chapter 1: Rust format defaults

I found it handy to add the Rust format defaults to the project's rustfmt.toml:

rustfmt --print-config default rustfmt.toml

Not sure where this would be best placed.

suggest stripping --release binary as it will result in 50% smaller program

It might be worth mentioning stripping the release executable, as this can result in a 50% decrease in program size, e.g.:

strip target/release/zero2prod

Using the zero2prod code I went from 17M to 8M. This makes less difference with an 80M Docker image, but as the program grows it will become more significant.

Chapter 6.5 - quickcheck 1.0 has changed trait Gen to struct Gen with no trait RngCore

As a result of this change, the book's code does not compile.
I suggest adding specific versions to these commands in the book, or updating the code ;)

cargo add quickcheck@0.9.2 --dev
cargo add quickcheck-macros@0.9.1 --dev

Changes:
before

pub trait Gen: RngCore {
   fn size(&self) -> usize;
}

now -> link to the source code

/// Gen represents a PRNG.
///
/// It is the source of randomness from which QuickCheck will generate
/// values. An instance of `Gen` is passed to every invocation of
/// `Arbitrary::arbitrary`, which permits callers to use lower level RNG
/// routines to generate values.
///
/// It is unspecified whether this is a secure RNG or not. Therefore, callers
/// should assume it is insecure.
pub struct Gen {
    rng: rand::rngs::SmallRng,
    size: usize,
}
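
For reference, the kind of impl the chapter writes only lines up with the 0.9 signature, which is why pinning the version matters. A sketch (the fixture name follows the chapter; it assumes the fake crate with a rand version compatible with quickcheck 0.9):

// Property-test fixture against quickcheck 0.9, where Arbitrary::arbitrary
// is generic over G: Gen, and Gen: RngCore gives us an Rng to hand to fake.
use fake::faker::internet::en::SafeEmail;
use fake::Fake;

#[derive(Debug, Clone)]
struct ValidEmailFixture(pub String);

impl quickcheck::Arbitrary for ValidEmailFixture {
    fn arbitrary<G: quickcheck::Gen>(g: &mut G) -> Self {
        let email = SafeEmail().fake_with_rng(g);
        Self(email)
    }
}

With quickcheck 1.0, Gen is a struct that no longer exposes RngCore, so this signature and the fake_with_rng call no longer compile without changes.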

3.8.5.1 Sqlx Feature Flags Code Snippet Syntax Error

First off, thank you for writing Zero To Production in Rust! I'm really enjoying your book and am looking forward to working through what's available now and the upcoming chapters.

In Chapter 3 (specifically 3.8.5.1), there's a code snippet there to update the Cargo.toml file to include sqlx with some feature flags. The example is missing version = "0.5.1" and features = [ for the beginning of the feature flag list.

# Using table-like toml syntax to avoid a super-long line!
[dependences.sqlx]                              
    "runtime-actix-rustls",
    "macros",
    "postgres",
    "uuid",
    "chrono",
    "migrate"
]

Here's what I ended up with that worked:

[dependencies.sqlx]
version = "0.5.1"
features = [
    "runtime-actix-rustls",
    "macros",
    "postgres",
    "uuid",
    "chrono",
    "migrate"
]

In your Chapter 3, Part 1 snapshot of the Cargo.toml file, you have:

sqlx = { version = "0.5.1", default-features = false, features = ["runtime-actix-rustls", "macros", "postgres", "uuid", "chrono", "migrate"] }

I didn't disable the default features, so we will see if that comes back to haunt me as I move along through the book. 😄

RUSTSEC-2020-0036: failure is officially deprecated/unmaintained

failure is officially deprecated/unmaintained

Details
Status unmaintained
Package failure
Version 0.1.8
URL rust-lang-deprecated/failure#347
Date 2020-05-02

The failure crate is officially end-of-life: it has been marked as deprecated
by the former maintainer, who has announced that there will be no updates or
maintenance work on it going forward.

The following are some suggested actively developed alternatives to switch to:

See advisory page for additional details.

Running the health check test with tokio 1.0.1 panics

This isn't really an issue with the guide itself, but maybe this should be addressed as it's, I think, a common thing people following along might run into.

Running cargo test on the root-chapter-03-part0 at 447fdbf works fine, but if I use tokio 1.0.1 instead of 0.2.22, like so:

diff --git a/Cargo.toml b/Cargo.toml
index b242cf2..131708e 100644
--- a/Cargo.toml
+++ b/Cargo.toml
@@ -15,7 +15,7 @@ name = "zero2prod"
 [dependencies]
 actix-web = "3.0.0"
 actix-rt = "1.1.1"
-tokio = "0.2.22"
+tokio = { version = "1.0.1", features = ["full"] }
 
 [dev-dependencies]
 reqwest = "0.10.7"

I get a panic when running cargo test:

running 1 test
test health_check_works ... FAILED

failures:

---- health_check_works stdout ----
thread 'health_check_works' panicked at 'must be called from the context of Tokio runtime configured with either `basic_scheduler` or `threaded_scheduler`', /home/rihards/.cargo/registry/src/github.com-1ecc6299db9ec823/tokio-1.0.1/src/task/spawn.rs:131:28
note: run with `RUST_BACKTRACE=1` environment variable to display a backtrace
Panic in Arbiter thread.

Still trying to figure out how to solve this myself and whether or not this is something to be reported upstream.

EDIT: checked that the same thing happens with tokio 0.3.

Typos in #7.1

Instead of creating a bunch of small issues for typos, I'll list them all as separate comments on this one issue.

In the Sharing Startup Logic code block it identifies the wrong file:

//! tests/app/helpers.rs

should be:

//! tests/api/helpers.rs

5.3.7 Database connectivity panics at PgPool::connect

I've followed along and am now receiving an error when running the Docker container. Also, thanks so much for writing the book. It's exactly what I was looking for and it's the perfect step in my beginner->intermediate journey.

#9 194.2 thread 'main' panicked at '
#9 194.2 failed to connect to Postgres URL postgres://postgres:password@localhost:5432/newsletter
#9 194.2 : Io(Os { code: 99, kind: AddrNotAvailable, message: "Cannot assign requested address" })', src/main.rs:16:10
#9 194.2 stack backtrace: ...

RUSTSEC-2020-0049: Use-after-free in Framed due to lack of pinning

Use-after-free in Framed due to lack of pinning

Details
Package actix-codec
Version 0.2.0
URL actix/actix-net#91
Date 2020-01-30
Patched versions >= 0.3.0-beta.1

Affected versions of this crate did not require the buffer wrapped in Framed to be pinned,
but treated it as if it had a fixed location in memory. This may result in a use-after-free.

The flaw was corrected by making the affected functions accept Pin<&mut Self> instead of &mut self.

See advisory page for additional details.

Chapter 1.4: Git hooks c.f. CI/CD pipelines

Thank you for taking the time and making the effort to write this book.

I was a little surprised to find no mention of Git hooks c.f. CI/CD services for fast feedback activities like formatting and linting - usually run via a pre-commit hook.
This definitely smooths the review process, and stops the CI/CD from breaking for trivia that should not make its way into a commit - IMO.

The use of left-hook (a Go binary), husky, or pre-commit ensures the hook configuration and scripts reside in the project.

I wonder if longer compile times could make git hooks appear to be as fast/slow as external CI/CD services?
Any other reasons to stay clear of git hooks?

Chapter 3.4 & 3.5: Declarative vs Imperative

In 3.5:

Our spec for the health check endpoint was:

When we receive a GET request for /health_check we return a 200 OK response with no body.

I actually couldn't recall seeing that spec. Might be useful to link to the relevant section?

But it raises the issue of writing specifications that are declarative rather than imperative.

In addition, for an integration spec there is a whole lot of implementation detail, and with:

The test also covers the full range of properties we are interested to check:

  • the health check is exposed at /health_check
  • the health check is behind a GET method
  • the health check always returns a 200
  • the health check’s response has no body

I think you'll agree that in most of that spec you are actually testing actix_web rather than your app.
These, IMO, usually belong in unit tests, where the implementation detail can/does matter; sometimes it has even been necessary for a unit test to validate the version of a library - I know the InSpec people say that should be a system-level detail, akin to their tests for the configured SSH protocol version.
In regulated environments there can be a need to test that a specific method is used or not used in the implementation - but these sit most comfortably in the unit test suite.

I think your initial context description (ch3, p18) had it mostly right:

As a blog visitor,
I want to subscribe to the newsletter,
So that I can receive email updates when new content is published on the blog.

Perhaps something similar can be added to ch 3.3 as the health check context:

As a blog administrator,
I want to be alerted when the sign-up service is unhealthy,
So that I can restore the sign-up service

With a specification for alerts:

Given the sign-up service is monitored
When the sign-up service is unhealthy
Then the administrator receives an alert

and a specification for no alerts:

Given the sign-up service is monitored
When the sign-up service is healthy
Then the administrator receives no alert

This way you don't change your integration specification files if you change your definition of "healthy" from 200 to 201 or 202 - contrived examples but not implausible. Of course your unit tests would change in that scenario.

Not sure if that helps?

Chapter 3 part 1 - 3.6 More typos, edits and comments

First I want to comment on your description of web::Data as an 'extractor' and how actix-web handles it at the end of section 3.6.3. The Data Extractor.

It's a clear enough description of new material, but being new to me, I wasn't quite sure what I would do with it beyond copying your example. But when you mentioned that this feature is similar to dependency injection, that really made it click for me. Not so much the similarity of the underlying mechanism, but just tying the discussion to some familiar terminology. This is the kind of material that really puts the polish on the book and makes it special.
Well done! 💯

3.6.4. The INSERT query, typo:

  1. The comment beginning:
// There is a bit of cerenomy here to get our hands on a &PgConnection.

ceremony is misspelled.

  2. The comment above ends with:
// We could have avoided the double Arc wrapping using .app_data()
// instead of .data() in src/startup.rs

It might be good to mention that this will be addressed later in this section.

  3. In the last code snippet of the section,
...
    .map_err(|e| {
        println!("Failed to execute query: {}", e);
...

The earlier version of this snippet uses eprintln!(), here it's changed to println!(). Intentional?

  4. Just a thought: I notice in other Rust projects as well that various database-specific function calls and queries are scattered throughout the code base. Would it not make sense to package database-specific code in a separate module, to allow for greater maintainability in the event a different database was employed for whatever reason? Is this just not thought about, too much trouble, a performance concern, or something else? I would love to hear your thoughts on this.
