snowflake-labs / sfquickstarts

Follow along with our tutorials to get you up and running with the Snowflake Data Cloud.

License: Apache License 2.0

Topics: snowflake, snowflake-quickstart, snowflake-guide


Snowflake QuickStart Guides

(Demo video: quickstarts.snowflake.com)

What are Snowflake Quickstarts?

Snowflake Quickstarts are interactive tutorials and self-serve demos written in Markdown syntax. Quickstarts provide a unique step-by-step reading experience and automatically save tutorial progress for readers. These tutorials are published at quickstarts.snowflake.com.

You can have your own Quickstarts published on Snowflake's website by submitting a pull request to this repo. This repository contains all the tools and documentation you'll need for building, writing, and submitting your own Quickstart!

What's special about the Quickstart format?

  • Powerful and flexible authoring flow in Markdown text
  • Ability to produce interactive web or markdown tutorials without writing any code
  • Easy interactive previewing
  • Usage monitoring via Google Analytics
  • Support for multiple target environments or events (conferences, kiosk, web, offline, etc.)
  • Support for anonymous use - ideal for public computers at developer events
  • Looks great, with a responsive web implementation
  • Remembers where the student left off when returning to a Quickstart
  • Mobile friendly user experience

Getting Started

Prerequisites

  1. Install Node Version Manager (nvm)
    • Not sure if you have it installed? Run nvm or nvm -v at the command line and hit enter. If you encounter a "command not found" error, you likely do not have it installed.
  2. Install Node (required to run the site locally) using nvm: nvm install node
    • If you have Homebrew installed and don't want to use nvm, run: brew install node
  3. Install gulp-cli: npm i -g gulp-cli
  4. Install Go
    • If you have Homebrew installed, run: brew install golang
    • Install claat: go install github.com/googlecodelabs/tools/claat@latest
    • Ensure go and claat are in your PATH (see the PATH setup under Common Errors below)
  5. Optional: install the live-reload plugin for Chrome: LiveReload
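Taken together, the prerequisite installs look roughly like this on macOS with Homebrew (one possible setup; adjust the commands to your platform):

# Node via nvm (or: brew install node)
nvm install node
# gulp CLI
npm i -g gulp-cli
# Go, then claat
brew install golang
go install github.com/googlecodelabs/tools/claat@latest
# make sure Go-installed binaries such as claat are on your PATH
export PATH=$PATH:$HOME/go/bin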

Run locally

  1. Fork this repository to your personal GitHub account (top right of webpage, fork button)
  2. Clone your new fork: git clone git@github.com:<YOUR-USERNAME>/sfquickstarts.git sfquickstarts
  3. Navigate to the site directory: cd sfquickstarts/site
  4. Install node dependencies: npm install
  5. Run the site: npm run serve
  6. Open a browser to http://localhost:8000/

Congratulations! You now have the Snowflake Quickstarts landing page running.
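For reference, the full sequence (assuming SSH access to your GitHub fork) is:

git clone git@github.com:<YOUR-USERNAME>/sfquickstarts.git sfquickstarts
cd sfquickstarts/site
npm install
npm run serve
# then browse to http://localhost:8000/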

Common Errors

1. claat-related errors

  • Make sure Go is properly in your PATH. Add the following lines to your profile (~/.profile or ~/.zshrc):
#adding Golang to path
export PATH=$PATH:/usr/local/go/bin
export PATH=$PATH:$HOME/go/bin

Note: After adding Go to your PATH, be sure to apply your new profile: source ~/.profile or source ~/.zshrc

2. You get an EACCES error when installing gulp-cli

  • This is an npm permissions issue; see npm's guide on resolving EACCES permissions errors when installing packages globally.

3. You get Error: Cannot find module 'postcss' when running npm run serve

  • The module may not have been installed for some reason, so run npm install --save-dev postcss gulp-postcss and then rerun npm run serve

Write Your First Quickstart

  1. Terminate the running server with Ctrl+C and navigate to the sfguides source directory: cd sfguides/src
    • In this directory, you will see all existing guides and their markdown files.
  2. Generate a new guide from the guide template: npm run template <GUIDE-NAME>
    • Don't use spaces in the name of your guide; use hyphens instead, as they are better for SEO.
  3. Navigate to the newly generated guide (cd sfguides/src/<GUIDE-NAME>) and edit your guide in a tool like VS Code.
  4. Run the website again: npm run serve (the full sequence is sketched after this list)
  5. As you edit and save changes, your changes will automatically load in the browser.
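A minimal end-to-end sketch, with my-first-guide standing in for your guide name:

cd sfguides/src
npm run template my-first-guide
# edit the generated markdown under sfguides/src/my-first-guide, then restart
npm run serve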


How do I get my Snowflake Quickstart on quickstarts.snowflake.com?

  1. You will need to sign Snowflake's CLA. The action required on your part is to specify in your (first) pull request comment that you accept it.
  2. Fork this repository
  3. Clone it to your local machine
  4. Make your changes/edits/updates on your locally cloned repo
  5. Run the site locally again via npm run serve and make sure your Quickstart guide shows up as you expect. Pay close attention to the layout and format. If you are unsure, use this Quickstart as a template to follow.
  6. Push the changes/edits/updates back to your repo
  7. Open this repository on GitHub.com
  8. Click the Pull Request button to open a new pull request
  9. Snowflake will review and approve the submission

To learn more about how to submit a pull request on GitHub in general, check out GitHub's official documentation.

Reporting issues or errata in Quickstarts

Quickstarts are not in the scope of Snowflake Global Support. Please do not file support cases for issues or errata in a Quickstart. If you encounter an issue in a Quickstart (outdated copy or data, typos, broken links, etc.), please file an issue in this repository. Be sure to include the following information:

  1. The title of the Quickstart
  2. A link to the Quickstart
  3. A description of the problem
  4. A proposed solution, if applicable (optional)


sfquickstarts's Issues

Intermittent Error when running dbt models

Describe the bug
Intermittent error when running dbt run --model +tfm_stock_history_major_currency:

Running with dbt=0.19.1
[WARNING]: Configuration paths exist in your dbt_project.yml file which do not apply to any resources.
There are 1 unused configuration paths:

  • models.dbt_hol.l30_mart

Found 7 models, 0 tests, 0 snapshots, 0 analyses, 316 macros, 0 operations, 0 seed files, 2 sources, 0 exposures

Encountered an error:
Runtime Error
cannot import name 'SnowflakeOCSPAsn1Crypto' from partially initialized module 'snowflake.connector.ocsp_asn1crypto' (most likely due to a circular import) (c:\snowflake\lab1\dbt-env\lib\site-packages\snowflake\connector\ocsp_asn1crypto.py)

URL of where you see the bug
https://guides.snowflake.com/guide/data_engineering_with_dbt/#7
To Reproduce
Steps to reproduce the behavior:

  1. Run the command: dbt run --model +tfm_stock_history_major_currency

Expected behavior

Running with dbt=0.19.1
[WARNING]: Configuration paths exist in your dbt_project.yml file which do not apply to any resources.
There are 1 unused configuration paths:

  • models.dbt_hol.l30_mart

Found 7 models, 0 tests, 0 snapshots, 0 analyses, 316 macros, 0 operations, 0 seed files, 2 sources, 0 exposures

21:04:41 | Concurrency: 200 threads (target='dev')
21:04:41 |
21:04:41 | 1 of 6 START view model l10_staging.base_knoema_stock_history........ [RUN]
21:04:41 | 2 of 6 START view model l10_staging.base_knoema_fx_rates............. [RUN]
21:04:44 | 2 of 6 OK created view model l10_staging.base_knoema_fx_rates........ [SUCCESS 1 in 2.62s]
21:04:44 | 3 of 6 START table model l20_transform.tfm_fx_rates.................. [RUN]
21:04:44 | 1 of 6 OK created view model l10_staging.base_knoema_stock_history... [SUCCESS 1 in 2.69s]
21:04:44 | 4 of 6 START view model l20_transform.tfm_knoema_stock_history....... [RUN]
21:04:46 | 4 of 6 OK created view model l20_transform.tfm_knoema_stock_history.. [SUCCESS 1 in 2.41s]
21:04:46 | 5 of 6 START view model l20_transform.tfm_stock_history.............. [RUN]
21:04:49 | 5 of 6 OK created view model l20_transform.tfm_stock_history......... [SUCCESS 1 in 2.47s]
21:04:52 | 3 of 6 OK created table model l20_transform.tfm_fx_rates............. [SUCCESS 1 in 7.91s]
21:04:52 | 6 of 6 START view model l20_transform.tfm_stock_history_major_currency [RUN]
21:04:55 | 6 of 6 OK created view model l20_transform.tfm_stock_history_major_currency [SUCCESS 1 in 2.79s]
21:04:56 |
21:04:56 | Finished running 5 view models, 1 table model in 19.96s.

Completed successfully

Done. PASS=6 WARN=0 ERROR=0 SKIP=0 TOTAL=6


Additional context
Rerunning the previous model from step 6 seemed to help, but I'm not sure why.

SSH Keys creation using terraform

Describe the bug
When deploying a user using the Terraform method, the key is not in PKCS#8 format, which causes an error:

  • using the JDBC connector: "Private key provided is invalid or not supported"
  • using snowsql: "Failed to load private key: Private key provided is not in PKCS#8 format".

After manually converting the private SSH key generated by Terraform, I got the following error: "250001 (08001): Failed to connect to DB: XXXXX.YYYY.azure.snowflakecomputing.com:443. JWT token is invalid."

I resolved it by altering the Terraform-generated user in Snowflake and inserting the public key as a one-liner, without the "-----BEGIN PUBLIC KEY-----" and "-----END PUBLIC KEY-----" lines.
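For reference, that workaround might look like this in Snowflake (the key value is a truncated placeholder; paste the body of your own public key as a single line, with the BEGIN/END lines removed):

ALTER USER "joe-bot" SET RSA_PUBLIC_KEY='MIIBIjANBgkqhkiG9w0BAQEFAAOC...';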

URL of where you see the bug
https://quickstarts.snowflake.com/guide/terraforming_snowflake/index.html?index=..%2F..index#8

To Reproduce
Steps to reproduce the behavior:
Apply the documentation with terraform.

Expected behavior
Deploy a user with the correct public key in Snowflake and have the private key in the right format from Terraform.

Desktop (please complete the following information):

  • OS: macOS Big Sur 11.6
  • Terraform: 1.0.9
  • registry.terraform.io/chanzuckerberg/snowflake: 0.25.22

Additional context
Terraform user configuration:

resource "snowflake_user" "joe" {
provider = snowflake.security_admin
name = "joe-bot"
default_role = "CREATE_ROLE"
default_namespace = "DB.PUBLIC"
rsa_public_key = substr(tls_private_key.joe_key.public_key_pem, 27, 398)
}

DBT and Snowflake Cloud

Describe the bug
Error querying the secure view "ECONOMY"."USINDSSP2020"

URL of where you see the bug

To Reproduce
Steps to reproduce the behavior:

  1. Execute the query: select * from "ECONOMY_DATA_ATLAS"."ECONOMY"."USINDSSP2020";
  2. See error:
    SQL compilation error: View definition for 'ECONOMY_DATA_ATLAS.ECONOMY.USINDSSP2020' declared 13 column(s), but view query produces 12 column(s).



Step 9 did not work correctly

Step 9 was supposed to alter a table and add a column. This is being skipped when I manually execute the Jenkins pipeline. I have the correct ./migration files.

This is the URL of my Jenkins GUI
URL: http://localhost:8080/job/snowflake-devops-demo/

To Reproduce

  1. Go to Jenkins Dashboard showing pipeline.
  2. Click on the Build with Parameters option in the left navigation bar.
  3. Verify that all the parameter values look correct and click on the blue Build button to start the Pipeline.
  4. See the logs.

Expected behavior
I expected the logs to have applied the latest changes in file V1.1.2__new_objects.sql. But it skipped this step. No errors. Just skipping, nothing to be done.


Desktop (please complete the following information):

  • OS: macOS Big Sur Version 11.5
  • Browser: Chrome
  • Version: Chrome Version 92.0.4515.107 (Official Build) (x86_64)

Additional context
When I restarted the Docker image, I started getting build errors.

Getting Started With User-Defined Functions: using ACCOUNTADMIN role not required

In the tutorial Getting Started With User-Defined Functions, on page https://quickstarts.snowflake.com/guide/getting_started_with_user_defined_functions/index.html#1
it says: "Switch the account role from the default SYSADMIN to ACCOUNTADMIN.". I'm pretty sure this isn't required. I was able to finish the tutorial using just the SYSADMIN role.

URL of where you see the bug
https://quickstarts.snowflake.com/guide/getting_started_with_user_defined_functions/index.html#1

Step 6 of Accelerating Data Teams with dbt & Snowflake issue

Step 6 of 'Accelerating Data Teams with dbt & Snowflake' instructs the user to add a packages.yml file for dbt_utils, pinned to version 0.7.1.

This version is not supported by dbt Cloud; the version should be updated to 0.8.0, which works.
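A packages.yml along these lines should work (dbt-labs/dbt_utils is assumed here to be the dbt Hub package name):

packages:
  - package: dbt-labs/dbt_utils
    version: 0.8.0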

Error from attempting version 0.7.1

19:48:34  Installing version 0.7.1
Runtime Error
  Failed to read package: Runtime Error
    This version of dbt is not supported with the 'dbt_utils' package.
      Installed version of dbt: =1.0.0
      Required version of dbt for 'dbt_utils': ['>=0.20.0', '<0.22.0']
    Check the requirements for the 'dbt_utils' package, or run dbt again with --no-version-check

https://quickstarts.snowflake.com/guide/data_engineering_with_dbt/index.html?index=..%2F..index#5

To Reproduce
Steps to reproduce the behavior:

  1. Follow the guide.
  2. Run dbt deps.
  3. It fails with the error above.

Expected behavior
Version 0.8.0 installs successfully.


Unable to terraform snowflake data warehouse

Hi, I followed the instructions here: https://guides.snowflake.com/guide/terraforming_snowflake/index.html?index=..%2F..index#3

but I couldn't terraform. I got:

error creating warehouse: Post "https://xxxxx.AWS_AP_SOUTHEAST_1.snowflakecomputing.com:443/session/v1/login-request?requestId=4c4c47c8-05b4-488f-be6a-4f98ae3cef7c&request_guid=9dd11f11-69d1-4412-8f40-6b7c030185cc&roleName=SYSADMIN": x509: certificate is valid for *.us-west-2.snowflakecomputing.com, *.snowflakecomputing.com, *.global.snowflakecomputing.com, *.prod1.us-west-2.aws.snowflakecomputing.com, *.prod2.us-west-2.aws.snowflakecomputing.com, *.us-west-2.aws.snowflakecomputing.com, not xxxxx.AWS_AP_SOUTHEAST_1.snowflakecomputing.com

Any help please?
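One thing worth checking (an assumption based on the error text, not a confirmed fix): the hostname in the error embeds the region as AWS_AP_SOUTHEAST_1, while Snowflake hostnames use lowercase, hyphenated region segments (e.g. xxxxx.ap-southeast-1.snowflakecomputing.com). If the guide's region environment variable is set, it may need that form:

# assumption: region should be lowercase and hyphenated, not AWS_AP_SOUTHEAST_1
export SNOWFLAKE_REGION="ap-southeast-1"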

Section Numbering in Zero 2 Snowflake Guide vs. SQL script

Hi team,

I just noticed that the numbering of sections differs between the Zero 2 Snowflake public guide and the corresponding SQL script:

  • In the public guide (Link): 4. Preparing to Load Data
  • In the SQL script (Link): Module 3 (3.x.x)

I think that this also has an impact on subsequent numbering of sections.

New Category - Resource Optimization

Given the efforts we're looking to put out there around Resource Optimization, and the guides/queries we're providing to our customers on the topic, it'd be great to have a separate category for it to enable easy filtering. Thank you!

JAVA_HOME Issues with sbt and "runMain UDFDemo"

The operation sbt "runMain UDFDemo" would not complete because a compatible SDK could not be found and JAVA_HOME was not correctly defined.

The instructions provided did not specify the download of any Java SDK and only the JRE was provided through the documented link.

Even when I installed jdk-8u_161-windows-x64.exe and jdk-8u_161-windows-i586.exe (tried both combinations) and set JAVA_HOME (and the Windows path to %JAVA_HOME%\bin), the problem remained.

I imagine that there is some easy solution (but it eludes me right now), as all of the other sbt options worked.

Database Error in model stg_knoema_stock_history

Describe the bug
There is some issue with the 'usindssp2020' view from the 'Knoema Economy Data Atlas' data source that makes it impossible to finish the lab.

To Reproduce
Steps to reproduce the behavior:

  1. Go to https://quickstarts.snowflake.com/guide/data_engineering_with_dbt/#7
  2. Follow the lab steps until 'stg_knoema_stock_history.sql' is saved with the code from the provided snippet
  3. Click the 'Preview' button
  4. See error

Expected behavior

  • Query results returned


Desktop (please complete the following information):

  • OS: Windows 10
  • Browser: Opera
  • Version: 80.0.4170.63

Additional context

  • dbt Cloud v1.1.39.11
  • dbt version 0.21.0
  • data source connected to Snowflake on 2021-11-16

DirectQuery.pbit missing from "Attaining Consumer Insights with Snowflake and Microsoft Power BI"

In Step 7 of the guide for the virtual hands-on lab "Attaining Consumer Insights with Snowflake and Microsoft Power BI", it instructs us to open the file DirectQuery.pbit. However, this file cannot be located anywhere in the guide or on GitHub. The instructors stated that it would be made available to us.

Please let me know where I can download this file to complete the lab!

Thanks

Dataiku HoL now creates bad models because Peru reported bad data

Describe the bug
The HoL "Accelerating Data Science with Snowflake and Dataiku" used to work great: users could go to the Snowflake Marketplace, get some Johns Hopkins COVID data, then build and score models in Dataiku. This HoL now produces really bad models. The underlying problem is that the data has changed over time, and the solution doesn't deal with some really bad data coming out of Peru on June 3, 2021.

URL of where you see the bug
Any instance of Dataiku.

To Reproduce
Steps to reproduce the behavior:

  1. Follow the steps of this hands on lab: https://quickstarts.snowflake.com/guide/data_science_with_dataiku/index.html?index=..%2F..index#0

Expected behavior
On this chapter of the lab, https://quickstarts.snowflake.com/guide/data_science_with_dataiku/index.html?index=..%2F..index#5 you can scroll to where it says "Drilldown into the details of the best performing model". Typically, the models score somewhere around .6 to .9 ROC. They are now about 0.1, which is even worse than flipping a coin.

I suspected that the data drifted and the models went stale. So, by adding filters, I determined that the solution works great if we filter to earlier than a certain date. The date that causes all the problems is June 2. If we break the problem out by country, we can see that Peru reported tons of deaths on that date but didn't assign a State. Then on June 3 they assigned a State for all the deaths. As the solution subtracts each day from the previous one to see the changes, there are tons of negative deaths on June 2 and then tons more negative deaths on June 3. There shouldn't be an extraordinary number of negative death counts, just a normal number of positive death counts.

Screenshots
Here is the problem with the underlying data: I have images, but GitHub doesn't seem to allow me to embed or attach them.


Additional context
I was able to fix the problem by adding a filter as my first step in the lab to filter out rows where the country is Peru. Another solution would be to delete all the weird death counts from Peru for June 2 and replace them with the identical data from June 1 or June 3.

"What we covered" section is incorrect on the last page

What We've Covered
Data Loading: Load Twitter streaming data in an event-driven, real-time fashion into Snowflake with Snowpipe
Semi-structured data: Querying semi-structured data (JSON) without needing transformations
Secure Views: Create a Secure View to allow data analysts to query the data
Snowpipe: Overview and configuration

We used Snowpark (Scala) and loaded a CSV file to perform sentiment analysis on sample tweet data. I recommend changing it to:

What We've Covered
Data Loading: Load Twitter data into Snowflake with Snowpark(Scala)
Data: Created a data frame from the CSV file and removed unwanted columns
Sentiment Analysis: Showed how to use Scala to perform sentiment analysis on a data frame of tweets
Snowpark: Used Scala to write the data frame into a Snowflake table

...or something like this.

Error Loading Data Step #5 in Getting Started with Snowflake - Zero to Snowflake

When running the copy into trips from @citibike_trips file_format=CSV; command (4.2.2) from the worksheet, I get the following error:

Numeric value '' is not recognized File 'citibike-trips/trips_2014_0_5_0.csv.gz', line 39, character 168 Row 39, column "TRIPS"["BIRTH_YEAR":15] If you would like to continue loading when an error is encountered, use other values such as 'SKIP_FILE' or 'CONTINUE' for the ON_ERROR option. For more information on loading options, please run 'info loading_data' in a SQL client.
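The error message itself points at a workaround: rerun the load with an ON_ERROR option so that bad rows are skipped instead of aborting the load (a sketch, not necessarily the lab's intended fix):

copy into trips from @citibike_trips file_format=CSV on_error='continue';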

Getting Started with Snowflake Tutorial: Typo in section #4 under "Create a File Format"

Describe the bug
Text for the "Creating a File Format" step looks incorrect.
It shows as:
Name: CSV Field optionally enclosed by: Double Quote Null string: [ ] Error on Column Count Mismatch:
If you do not see the "Error on Column Count Mismatch" box, scroll down in the dialogue box.

URL of where you see the bug
https://quickstarts.snowflake.com/guide/getting_started_with_snowflake/index.html#3

To Reproduce
Steps to reproduce the behavior:

  1. Go to 'https://quickstarts.snowflake.com/guide/getting_started_with_snowflake/index.html#3'
  2. Scroll down to 'Create a File Format'
  3. Note that the instructions that appear after 'On the resulting page we create a file format. In the box that appears, leave all the default settings as-is but make the changes below:' appear to have an error message in them.

Expected behavior
The text for the step should not have an error in it.


Desktop (please complete the following information):

  • OS: Mac OS Big Sur Version 11.4
  • Browser: Chrome Version 94.0.4606.81 (Official Build) (x86_64)


invoke_model function takes 3 arguments.

Describe the bug
The invoke_model function takes 3 arguments; please update the SELECT statement for real-time predictions.
URL of where you see the bug
https://quickstarts.snowflake.com/guide/recommendation_engine_aws_sagemaker/index.html?index=..%2F..index#11

To Reproduce
Steps to reproduce the behavior:
Check the last SQL statement on this page:

select nr.USERID, nr.MOVIEID, m.title, invoke_model(nr.USERID,nr.MOVIEID) as rating_prediction
from no_ratings nr, movies m
where nr.movieid = m.movieid;

Expected behavior

select nr.USERID, nr.MOVIEID, m.title, invoke_model('',nr.USERID,nr.MOVIEID) as rating_prediction
from no_ratings nr, movies m
where nr.movieid = m.movieid;


Unavailable tables

I'm currently working with the dbt lab for Snowflake and I have some issues related to the tables. The database "KNOEMA_ECONOMY_DATA_ATLAS" in the dbt lab works with the tables "exratescc2018" and "usindssp2020", but when I got this database from the Marketplace, it didn't include those tables.

(Screenshots attached: the tables shown in the dbt lab vs. the tables included in the "KNOEMA_ECONOMY_DATA_ATLAS" database I got from the Marketplace.)

Lab uses private preview features not available to customers

In step 10 of the lab "Process PII data using Snowflake RBAC, DAC, Row Access Policies, and Column Level Security" we demonstrate Classification features. These steps cannot be completed by customers unless they are part of the private preview.

The function used in that step, "extract_semantic_categories", does not work unless they have an account where this is enabled.

How to run on a mac

I'm new to the Mac; it took me an hour to figure this out...
Mac users using the terminal should add variables to ~/.zshenv (e.g. export AWSPASS=123456).
The code also needed import os at the top to run in the terminal.
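For example (the AWSPASS value is a placeholder):

# persist the variable for future terminal sessions
echo 'export AWSPASS=123456' >> ~/.zshenv
source ~/.zshenv

The Python script can then read the value via os.environ once import os is at the top.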

Broken links

Describe the bug
Here is a list of several broken links in various quick start guides.

URL of where you see the bug

Web page -> Broken link
https://quickstarts.snowflake.com/guide/security_network_architecture_pattern/index.html#3 -> https://docs.aws.amazon.com/vpc/latest/userguide/what-is-amazon-vpc.html%23what-is-privatelink&sa=D&ust=1608660555649000&usg=AOvVaw1Q1lpnr0FYZf_X3kHthIya
https://quickstarts.snowflake.com/guide/security_network_architecture_pattern/index.html#4 -> https://docs.microsoft.com/en-us/azure/private-link/private-link-overview&sa=D&ust=1608660555650000&usg=AOvVaw2BGDTLXGqLToQI-Svcljfh
https://quickstarts.snowflake.com/guide/vhol_fivetran/index.html#2 -> https://fivetran.com/dashboard
https://quickstarts.snowflake.com/guide/sample/index.html#0 -> https://raw.githubusercontent.com/Snowflake-Labs/sfguides/master/site/sfguides/sample.md
https://quickstarts.snowflake.com/guide/sample/index.html -> https://raw.githubusercontent.com/Snowflake-Labs/sfguides/master/site/sfguides/sample.md
https://quickstarts.snowflake.com/guide/vhol_data_marketplace_app/index.html#0 -> https://download
https://quickstarts.snowflake.com/guide/vhol_data_marketplace_app/index.html#3 -> https://github.com/brenStokes/Building-an-App-with-Data-Marketplace-/tree/main/quasar
https://quickstarts.snowflake.com/guide/vhol_data_marketplace_app/index.html#4 -> https://github.com/brenStokes/Building-an-App-with-Data-Marketplace-/tree/main/Lambda-src

To Reproduce
Steps to reproduce the behavior:

  1. Go to the page in the first column of the table.
  2. Right-click the web page and select "view source".
  3. Search for the URL in the 2nd column.
  4. Go back to the page and click the link.

Missing grant statements at step 5

Describe the bug
Missing grant statements at step 5

URL of where you see the bug
https://quickstarts.snowflake.com/guide/vhol_snowflake_salesforce_tcrm/index.html?index=..%2F..index#4
To Reproduce
Steps to reproduce the behavior:
Run the flow. Once you run step 6, you will get errors.

Expected behavior
Add these two lines:
GRANT IMPORTED PRIVILEGES ON DATABASE SAFEGRAPH_STARBUCKS TO ROLE CRM_ANALYST_ROLE;

GRANT IMPORTED PRIVILEGES ON DATABASE WEATHERSOURCE TO ROLE CRM_ANALYST_ROLE;

Terraforming guide has incorrect snowflake service user

Hello.
In the Terraforming guide, the Snowflake service user that is created in step 3 is 'tf-snow'.
In step 4 the guide says to use 'tf-user':
export SNOWFLAKE_USER="tf-user"

These user names should match.
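That is, step 4 should presumably export the user created in step 3:

export SNOWFLAKE_USER="tf-snow"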
Thanks
Steve

invoke_model function with remote service error 400

Describe the bug
Following the "Build a Recommendation Engine with AWS SageMaker" instructions, and at the last step of calling the nvoke_model function, I got the error message:
Request failed for external function INVOKE_MODEL with remote service error: 400 'list index out of range'; requests batch-id: 019d281f-0400-c93b-0053-30830001f57e:0:1; request batch size: 2 rows; request retries: 0; response time (last retry): 66.92ms
Not sure how to resolve it...

URL of where you see the bug
https://quickstarts.snowflake.com/guide/recommendation_engine_aws_sagemaker/index.html?index=..%2F..index#11

To Reproduce
select nr.USERID, nr.MOVIEID, m.title, invoke_model(nr.USERID,nr.MOVIEID) as rating_prediction
from no_ratings nr, movies m
where nr.movieid = m.movieid;

Getting Started with Snowpark Quickstart link to "Source code example on Github" goes to wrong place

"Download" link for the quickstart - Building an application on Snowflake is not working

Describe the bug
"Download" link for the quickstart - Building an application on Snowflake with data from Snowflake Data Marketplace is not working

URL of where you see the bug
https://quickstarts.snowflake.com/guide/vhol_data_marketplace_app/index.html?index=..%2F..index#0

To Reproduce
Steps to reproduce the behavior:
Click on the download link under "What You'll Need".

What You'll Need
An AWS free trial account
A Snowflake trial account
Download snowflake-connector-python, snowflake queries, quasar sample application code.

Desktop (please complete the following information):
Mac - Chrome

Accelerating Data Teams with dbt & Snowflake: Database error when querying KNOEMA_ECONOMY_DATA_ATLAS.ECONOMY.USINDSSP2020

Describe the bug
In the lab Accelerating Data Teams with dbt & Snowflake, the following error occurs when querying the view KNOEMA_ECONOMY_DATA_ATLAS.ECONOMY.USINDSSP2020.

URL of where you see the bug
https://quickstarts.snowflake.com/guide/data_engineering_with_dbt/index.html?index=..%2F..index&mkt_tok=MjUyLVJGTy0yMjcAAAGAw7qDQkpiA_IARtqZyzNTwfvNesrTfAZ-VHnyCnDIA1Zb8AiIsoz3v5YhR1-KvSQIraAX-3rZgxfomASLghicVW6ocF49SDgoPemuNB9nzNsTiOY#3

To Reproduce
Steps to reproduce the behavior:

  1. Go to 'Step 4 of Lab'
  2. Scroll down to step where following query needs to be run:
select * 
  from KNOEMA_ECONOMY_DATA_ATLAS.ECONOMY.USINDSSP2020
 where "Date" = current_date();
  3. Run the query in the Snowflake console
  4. See error:
View definition for 'KNOEMA_ECONOMY_DATA_ATLAS.ECONOMY.USINDSSP2020' declared 12 column(s), but view query produces 13 column(s).

Expected behavior
Query should return data.


Desktop (please complete the following information):

  • OS: macOS Version 10.15.7 (Build 19H1419)
  • Browser: Chrome
  • Version: 95.0.4638.69 (Official Build) (x86_64)

secure view has changed

Has the underlying view changed? Can you please assist, as this is a secure view?

dbtcli-snowflake % dbt run --model l10_staging --no-version-check
Running with dbt=0.21.0
[WARNING]: Configuration paths exist in your dbt_project.yml file which do not apply to any resources.
There are 2 unused configuration paths:

  • models.dbtcli_snowflake.l30_mart
  • models.dbtcli_snowflake.l20_transform

Found 2 models, 0 tests, 0 snapshots, 0 analyses, 359 macros, 0 operations, 0 seed files, 2 sources, 0 exposures

00:29:02 | Concurrency: 200 threads (target='dev')
00:29:02 |
00:29:02 | 1 of 2 START view model l10_staging.base_knoema_fx_rates............. [RUN]
00:29:02 | 2 of 2 START view model l10_staging.base_knoema_stock_history........ [RUN]
00:29:04 | 2 of 2 ERROR creating view model l10_staging.base_knoema_stock_history [ERROR in 1.98s]
00:29:04 | 1 of 2 OK created view model l10_staging.base_knoema_fx_rates........ [SUCCESS 1 in 2.10s]
00:29:04 |
00:29:04 | Finished running 2 view models in 5.12s.

Completed with 1 error and 0 warnings:

Database Error in model base_knoema_stock_history (models/l10_staging/base_knoema_stock_history.sql)
002057 (42601): SQL compilation error:
View definition for 'KNOEMA_ECONOMY_DATA_ATLAS.ECONOMY.USINDSSP2020' declared 12 column(s), but view query produces 13 column(s).
compiled SQL at target/run/dbtcli_snowflake/models/l10_staging/base_knoema_stock_history.sql

Done. PASS=1 WARN=0 ERROR=1 SKIP=0 TOTAL=2

lab_scripts.sql file missing

In section 3 it says "Browse to the lab_scripts.sql file you downloaded in the previous section and select Open." But there was nowhere to download this in the previous section.

Building an application on Snowflake with data from Snowflake Data Marketplace

Describe the bug
The given view definition results in an error.

URL of where you see the bug
https://quickstarts.snowflake.com/guide/vhol_data_marketplace_app/index.html?index=..%2F..index#2

To Reproduce
Steps to reproduce the behavior:

  1. Go to 'https://quickstarts.snowflake.com/guide/vhol_data_marketplace_app/index.html?index=..%2F..index#2'
  2. Click on '3rd Step - Create Snowflake Views'
  3. Scroll down to '4th point where you create database and views'
  4. Try to create the VHOLAPP2 view
  5. See error:
    Executing the VHOLAPP2 view definition fails with a compile error and the view is not created

Expected behavior
We expect the view to be created.


Desktop (please complete the following information):

  • OS: Windows
  • Browser: chrome
  • Version: Version 94.0.4606.81 (Official Build) (64-bit)


Additional context
On joining with Saving rate sr, the key used for the join should be agi."geo RegionId", but it is written as agi."geo - RegionId".

Fix
We need to change the last line of the view definition from agi."geo - RegionId" to agi."geo RegionId".

Upgrading to v3 with a Mac

I have a Mac, where Python 2.7 is installed by default. I went to the Python website and installed 3.9.4 using the installer. It installs fine. Going forward, I don't use python or pip; I need to use python3 and pip3, or do something like create an alias if all commands are going to use python. You may want to mention which way to go.
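For example, in ~/.zshrc (a sketch; adjust to taste):

# make bare python/pip invocations use the Python 3 installs
alias python=python3
alias pip=pip3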

Incorrect query in the "Resource Optimization: Setup & Configuration" guide

Describe the bug
Incorrect query in the "Resource Optimization: Setup & Configuration" guide

URL of where you see the bug
https://guides.snowflake.com/guide/resource_optimization_setup/index.html?index=..%2F..index#0

To Reproduce
Go to page 6 and execute the query provided. It will not return the warehouses that have no resource monitor associated with them. The filter should be updated to "resource_monitor" = 'null'.

SHOW WAREHOUSES;
SELECT "name" AS WAREHOUSE_NAME,
       "size" AS WAREHOUSE_SIZE
FROM TABLE(RESULT_SCAN(LAST_QUERY_ID()))
WHERE "resource_monitor" IS NULL;
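With the fix described above applied (per the report, the "resource_monitor" column from RESULT_SCAN contains the string 'null' rather than a SQL NULL when no monitor is attached), the query would read:

SHOW WAREHOUSES;
SELECT "name" AS WAREHOUSE_NAME,
       "size" AS WAREHOUSE_SIZE
FROM TABLE(RESULT_SCAN(LAST_QUERY_ID()))
WHERE "resource_monitor" = 'null';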


Root Folder Issue - CI pipeline

Describe the bug
I'm trying to follow the steps on the Snowflake guide page to set up an Azure DevOps CI pipeline.

https://guides.snowflake.com/guide/devops_dcm_schemachange_azure_devops/index.html?index=..%2F..index#5

Root Folder Issue
I have set the Project_Folder parameter to $(System.DefaultWorkingDirectory) and I receive the following error:
ValueError: Invalid root folder: /home/vsts/work/1/s/azure-pipeline-test.

Although my root folder seems to be correct (see the attached screenshots of the folder structure and the pipeline YAML).

When I delete the "/azure-test-pipeline from root below, the CI pipeline runs without errors, creates table in snowflke but doesn't insert anything there nor runs SQL scripts.

Additional context
IMHO the root folder automatically gets "/home/vsts/work/1/s" prepended, which is a bit confusing for me.
