cloud-native-toolkit / ibm-garage-iteration-zero

Iteration Zero terraform scripts for IBM Cloud provisioned services and clusters

License: Apache License 2.0

Shell 63.70% HCL 36.30%

ibm-garage-iteration-zero's Introduction

IBM Garage Solution Engineering


Iteration Zero for IBM Cloud

This repository contains Infrastructure as Code (IaC) scripting to create an IBM Garage Cloud Native Toolkit development environment ready for cloud-native application development with IBM Cloud Kubernetes Service or Red Hat OpenShift.

Overview

Iteration Zero creates an IBM Garage Cloud-Native Toolkit environment in IBM Cloud, complete with the tools and services needed for continuous delivery of typical cloud-native applications to an IBM Cloud Kubernetes Service or Red Hat OpenShift on IBM Cloud cluster. Typically, a squad lead or Site Reliability Engineer (SRE) would create this environment after the toolchain has been decided upon (often during the Inception workshop) and before the development team is ready to write code (when Iteration One has started).

The objective of this environment is to reduce the amount of time and effort a team needs to spend creating and configuring their Kubernetes or OpenShift development environments. Rather than the team having to re-invent the wheel deciding how to set up a DevOps environment and performing the manual effort to create, install, and configure the cluster, tools, and services, these Infrastructure-as-Code (IaC) scripts automate the process to consistently create an environment, as needed, that embodies these best practices. The scripts are modular so tools can be easily added or removed. This combination of tools has been proven in the industry to deliver real value for modern cloud-native development.

The Red Hat Open Innovation Labs has a very similar approach to how they deliver success with OpenShift.

You can jump straight to the Developers Guide if you want more detail on how the Cloud-Native Toolkit fits into the end-to-end development story.

This repo contains Terraform resources that will create an environment containing the development tools described below.

The stages provided in this repository supply the configuration for a set of Terraform modules that are provided as part of the Cloud-Native Toolkit. A full listing of those modules can be found in the Garage Terraform Modules repository. In addition to the modules provided with the Cloud-Native Toolkit, any Terraform modules or scripts can be incorporated into the Iteration Zero installation configuration.
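For orientation, each stage is essentially a Terraform module block pinned to a released version of a toolkit module. The module path, ref, and variable names in this sketch are illustrative only, not copied from this repository:

```hcl
# Hypothetical stage file wiring in one Cloud-Native Toolkit module.
# Source URL, version ref, and input names are illustrative.
module "dev_tools_example" {
  source = "github.com/cloud-native-toolkit/terraform-tools-example.git?ref=v1.0.0"

  cluster_config_file = module.dev_cluster.config_file_path
  releases_namespace  = module.dev_cluster_namespaces.tools_namespace
}
```

Adding or removing a tool then amounts to adding or deleting one such stage file, which is why the scripts stay modular.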

Developer Tools

This diagram illustrates the components in a Cloud-Native Toolkit environment:

Provisioned environment

Artifactory is an Open Source product maintained by JFrog

Jenkins is an Open Source project maintained by the Jenkins community

SonarQube is an Open Source project maintained by SonarSource

Nexus Repository is an Open Source project maintained by Sonatype

Trivy is an Open Source project maintained by Aqua

IntelliJ IDEA is an IDE from JetBrains

VS Code is a free IDE maintained by Microsoft

Jaeger is an Open Source tool maintained by Jaeger Community

ArgoCD is an Open Source tool maintained by ArgoCD Community

OpenShift and CodeReady Workspaces are products from Red Hat

LogDNA is an IBM Cloud service supplied by LogDNA

Sysdig is an IBM Cloud service supplied by Sysdig

Developer Guide

Developer Guide explains how to use the Cloud-Native Toolkit environment. Use it to deep dive into how to use these tools and programming models to make yourself productive with Kubernetes and OpenShift on the IBM Cloud.

Install and Configure

Start with the installation instructions for creating the Cloud-Native Toolkit environment. They explain how to set up and run the Terraform Infrastructure as Code scripts in this repo.

You can install this collection of CNCF DevSecOps tools using the IBM Cloud Private Catalog feature. More information on how to configure an IBM Cloud Private Catalog tile and complete an installation can be found in this README or in the documentation in the Developer Guide.

Developer Dashboard

Developer Dashboard explains how to open the dashboard for using the Cloud Developer Tools environment.

Destroying

The scripts that created the Cloud Developer Tools environment can also be used to destroy it. See destroy for instructions.

Development

Process

Create a pull request

When a pull request is created, there are a couple of categories of labels that can be used to influence the workflow behavior.

Verification process

  • skip ci - Indicates that the validation process should be skipped for the PR. This should be used sparingly but is particularly appropriate if the change only affects files that are inconsequential to the terraform installation process, like the README or a shell script. The intent of this label is to speed things up and save the resources the full verification would otherwise consume.

Change type

The change type labels (feature, bug, chore) tell the component that generates the change log what type of change the pull request represents.

  • feature - The change introduces a new feature or enhancement for the repository
  • bug - The change provides a fix to existing code in the repository
  • chore - The change represents maintenance performed on the repository

Release type

The release type labels (major, minor, patch) tell the component that generates the release how to increment the version number.

  • major - The change should be part of a new major release (e.g. 2.0.0)
  • minor - The change should be part of a new minor release (e.g. x.2.0)
  • patch - The change should be part of a new patch release (e.g. x.y.3)
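The version bump implied by each label can be sketched in shell. The helper below is an illustration of the semver convention the labels follow, not code from the repository's workflows:

```shell
# Hypothetical helper: bump a semver string according to a release label.
# Only the label names (major/minor/patch) come from the README above.
bump_version() {
  version=$1
  label=$2
  major=${version%%.*}
  rest=${version#*.}
  minor=${rest%%.*}
  patch=${rest#*.}
  case "$label" in
    major) echo "$((major + 1)).0.0" ;;
    minor) echo "${major}.$((minor + 1)).0" ;;
    patch) echo "${major}.${minor}.$((patch + 1))" ;;
    *)     echo "$version" ;;       # unknown label: leave version unchanged
  esac
}

bump_version 1.4.2 minor   # prints 1.5.0
```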

Squash and merge the pull request

When the pull request has been validated and reviewed, squash and merge it.

Publish the release

Each change will create or contribute to a draft release on the repository. When it is time to publish the release, edit the draft release in GitHub and click the Publish button. That will make the release visible outside of the development team and trigger the Publish assets workflow.

Automated workflows

The repository uses GitHub Actions to automate the steps to validate, release, and publish the assets.

Verify PR

When a pull request is created or a change is pushed to an existing PR, the Verify PR workflow runs to execute the full Iteration Zero workflow against existing IKS, OCP3, and OCP4 clusters.


Labels added to the Pull Request will impact the behavior of the following Release workflow when the PR is merged.

Release

Pushing changes to the master branch, either by merging a PR (ideally) or by a direct push, triggers the Release workflow. The workflow should validate the build by running through the entire Terraform process against an existing cluster, although that check has been turned off for the time being. If the verification completes successfully, the workflow creates (or appends to) a draft release containing the changes. Each commit to the repository will be added to the draft release until someone manually publishes the release.


Publish assets

Publishing a new release will trigger the Publish assets workflow. The workflow generates any assets that should be published with the release (currently the IBM Private Catalog tile assets) based on the released version of the repository.


Create PR

The upstream terraform module repositories have been configured to trigger a repository_dispatch event when a new release of the module has been published. An incoming repository_dispatch event triggers the Create PR workflow to update the version of each module in the repository and create a Pull Request if there are any changes.
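For illustration, a repository_dispatch event is just a POST to the GitHub API. The event type name and payload shape below are assumptions, not taken from the actual module repositories; only the endpoint and headers are standard GitHub:

```shell
# Hypothetical payload builder for a repository_dispatch event. The
# "module-release" event type and client_payload fields are illustrative.
build_dispatch_payload() {
  printf '{"event_type":"%s","client_payload":{"version":"%s"}}' "$1" "$2"
}

# Sending the event (not executed here; requires a token with repo scope):
# curl -X POST \
#   -H "Accept: application/vnd.github.v3+json" \
#   -H "Authorization: token ${GITHUB_TOKEN}" \
#   -d "$(build_dispatch_payload module-release v1.2.3)" \
#   https://api.github.com/repos/cloud-native-toolkit/ibm-garage-iteration-zero/dispatches
build_dispatch_payload module-release v1.2.3
```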


Summary

We are working to make Kubernetes and OpenShift development as easy as possible. This toolkit adds what feels like a PaaS layer to a Kubernetes environment. Any feedback on the use of the project is most welcome.

Thanks, IBM Garage Solution Engineering

ibm-garage-iteration-zero's People

Contributors

bwoolf1, cloudnativetoolkit, csantanapr, github-actions[bot], hemankita, holly-cummins, johnemibm, lsteck, michaelsteven, seansund, triceam


ibm-garage-iteration-zero's Issues

Update the dashboard stage with new name of the module

Change to use new module github.com/ibm-garage-cloud/garage-terraform-modules.git//generic/tools/developerdashboard_release

Old location github.com/ibm-garage-cloud/garage-terraform-modules.git//generic/tools/catalystdashboard_release

Move quick install script from dev guide here

Describe the bug
A clear and concise description of what the bug is.

Version
What version of Iteration Zero are you using? If you are running from the cloned repo, then
provide the latest commit hash from ibm-garage-cloud/ibm-garage-iteration-zero or the tag if
working off of a tagged version. If you are running from a tile, provide the tile version.

Environment configuration
Provide the contents of the environment.tfvars file here

Stages
Provide the stages that were run

Terraform log
Provide the full terraform log. This file could be big so compress it in a zip or tgz file, upload
it to some network storage location (like Box) and provide the url here. (Be sure to give us access
to the file.)

The best way to collect the logs is to set the following values prior to running the script:

export TF_LOG=TRACE
export TF_LOG_PATH=./terraform.trace

To Reproduce
Steps to reproduce the behavior:

  1. Go to '...'
  2. Click on '....'
  3. Scroll down to '....'
  4. See error

Expected behavior
A clear and concise description of what you expected to happen.

Screenshots
If applicable, add screenshots to help explain your problem.

Desktop (please complete the following information):

  • OS: [e.g. iOS]
  • Browser [e.g. chrome, safari]
  • Version [e.g. 22]

Additional context
Add any other context about the problem here.

Iteration Zero fails on bare-metal OCP cluster

When doing a fast-install on a bare metal OCP cluster (4.9) with no cloud provider, I get the following error:

│ Error: Missing required argument
│ 
│   on stage2-image-registry.tf line 1, in module "dev_tools_ibm_image_registry":
│    1: module "dev_tools_ibm_image_registry" {
│ 
│ The argument "region" is required, but no definition was found.
╵
╷
│ Error: Unsupported argument
│ 
│   on stage2-image-registry.tf line 5, in module "dev_tools_ibm_image_registry":
│    5:   cluster_region      = module.dev_cluster.region
│ 
│ An argument named "cluster_region" is not expected here.

Version
Used the command curl -sfL get.cloudnativetoolkit.dev | sh - to do the install with the latest iteration-zero release, 2.18.0

I didn't make any modifications, so all default values being used (this command ran 2 weeks ago without any issues)

In-cluster postgresql fails on CRC/OCP43

Using branch ocp4

Setup vars for crc cluster type

After some items complete successfully, it fails on postgresql

module.dev_tools_sonarqube_release.null_resource.sonarqube_release: Creation complete after 5m19s [id=2567534695724848183]

Error: Error running command '.terraform/modules/dev_infrastructure_postgres/self-managed/software/postgres/scripts/deploy-postgres_openshift.sh': exit status 127. Output: --> Deploying template "openshift/postgresql-persistent" to project tools

     PostgreSQL
     ---------
     PostgreSQL database service, with persistent storage. For more information about using this template, including OpenShift considerations, see https://github.com/sclorg/postgresql-container/.
     
     NOTE: Scaling to more than one replica is not supported. You must have persistent volumes available in your cluster to use this template.

     The following service(s) have been created in your project: postgresql.
     
            Username: user8LH
            Password: 6pcSQDtEXDoaQW2K
       Database Name: sampledb
      Connection URL: postgresql://postgresql:5432/
     
     For more information about using this template, including OpenShift considerations, see https://github.com/sclorg/postgresql-container/.

     * With parameters:
        * Memory Limit=512Mi
        * Namespace=openshift
        * Database Service Name=postgresql
        * PostgreSQL Connection Username=user8LH # generated
        * PostgreSQL Connection Password=6pcSQDtEXDoaQW2K # generated
        * PostgreSQL Database Name=sampledb
        * Volume Capacity=1Gi
        * Version of PostgreSQL Image=10

--> Creating resources ...
    secret "postgresql" created
    service "postgresql" created
    persistentvolumeclaim "postgresql" created
    deploymentconfig.apps.openshift.io "postgresql" created
--> Success
    Application is not exposed. You can expose services to the outside world by executing one or more of the commands below:
     'oc expose svc/postgresql' 
    Run 'oc status' to view your app.
.terraform/modules/dev_infrastructure_postgres/self-managed/software/postgres/scripts/deploy-postgres_openshift.sh: line 5: -p: command not found
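Exit status 127 means the shell could not find a command named `-p`. A plausible cause (an assumption, since the failing script itself is not shown) is a broken backslash line continuation that turns a flag on the following line into a standalone command. A harmless reproduction of the failure mode:

```shell
# Without a trailing backslash, the "-p ..." on the next line is parsed
# as its own command; no command named "-p" exists, so the shell reports
# "not found" and exits with status 127.
sh -c '
echo "oc process ... (stand-in for the real command)"
-p POSTGRESQL_USER=user
' 2>/dev/null
echo "exit status: $?"   # prints "exit status: 127"
```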

Install fails after cluster and postgresql on OCP43

Using branch ocp4

Using all default values.

Set vlan ids, and datacenter in vlans.tfvars
Set name_prefix to csantana2 and region us-east

cluster_type="ocp4"
cluster_exists="false"
name_prefix="csantana2"
resource_group_name="catalyst-demos"
vlan_region="us-east"
postgres_server_exists="false"

After a few minutes it fails with the following message, but the cluster is "Normal" and the postgresql instance is also "Active"

module.dev_cluster.ibm_container_cluster.create_cluster[0]: Creation complete after 23m27s [id=bpikmb4w0cnajdhgauug]
module.dev_cluster.data.ibm_container_cluster.config: Refreshing state...
module.dev_cluster.null_resource.oc_login[0]: Creating...
module.dev_cluster.null_resource.oc_login[0]: Provisioning with 'local-exec'...
module.dev_cluster.null_resource.oc_login[0] (local-exec): Executing: ["/bin/sh" "-c" "oc login -u apikey -p <REDACTED> --server=https://c100-e.us-east.containers.cloud.ibm.com:31718 > /dev/null"]
module.dev_cluster.null_resource.oc_login[0] (local-exec): Error from server (InternalError): Internal error occurred: unexpected response: 500


Error: Error running command 'oc login -u apikey -p <REDACTED> --server=https://c100-e.us-east.containers.cloud.ibm.com:31718 > /dev/null': exit status 1. Output: Error from server (InternalError): Internal error occurred: unexpected response: 500




Error: malformed CRN: Error parsing JSON

  on .terraform/modules/dev_infrastructure_postgres/cloud-managed/services/postgres/main.tf line 48, in data "ibm_resource_key" "postgresql":
  48: data "ibm_resource_key" "postgresql" {

IBM Console view

OCP4: Error downloading the cluster config

Describe the bug
I created a ROKS 4.5 cluster in VPC Gen 2 and, while running the toolkit, got stuck with the below error. Logs from Schematics

 2020/10/19 10:58:47 Terraform apply | Error: Error downloading the cluster config [appdev-cloud-native-toolkit]: Could not login to openshift account
 2020/10/19 10:58:47 Terraform apply | 
 2020/10/19 10:58:47 Terraform apply |   on .terraform/modules/dev_cluster/main-2-config.tf line 53, in data "ibm_container_cluster_config" "cluster":
 2020/10/19 10:58:47 Terraform apply |   53: data "ibm_container_cluster_config" "cluster" {

LogDNA and SysDig are set to false

Version
2.1.0

Schematics job fails with undeclared variable: provision_activity_tracker

Describe the bug
I am using the option to add a tile to my private catalog; I created the tile and ran it to install the tools into an existing ocp4 cluster on VPC. I filled in all of the required and optional parameters in the tile and created the workspace. The terraform apply in Schematics fails due to a missing variable in stage3-activity-tracker.tf.

Version
The tile version is v1.10-beta.5

Terraform log

2020/08/12 21:48:26 -----  New Workspace Action  -----
 2020/08/12 21:48:26 Request: activitId=f5e1f900d526a06ebb7da933a0eb63b6, account=9d5d528aa786af01ce99593a827cd68a, [email protected], requestID=c0568773-22cf-4b00-b9f3-164d4176c2e3
 2020/08/12 21:48:27 Related Activity: action=WORKSPACE_CREATE, workspaceID=cloudnative-toolkit-garage-dev-tools-01-500c44df-54c6-4e, processedBy=orchestrator-7cb4bc87d6-wxk4j
 2020/08/12 21:48:27 Related Workspace: name=cloudnative-toolkit-garage-dev-tools-01, sourcerelease=(not specified), sourceurl=https://github.com/ibm-garage-cloud/ibm-garage-iteration-zero/releases/download/v1.10-beta.5/cloudnative-toolkit.tar.gz
 2020/08/12 21:48:30  --- Ready to execute the command --- 
 2020/08/12 21:48:30 -----  New Action  -----
 2020/08/12 21:48:30 Request: RepoURL=https://github.com/ibm-garage-cloud/ibm-garage-iteration-zero/releases/download/v1.10-beta.5/cloudnative-toolkit.tar.gz, WorkspaceSource=Schematics, Branch=, Release=, Folder=cloudnative-toolkit
 2020/08/12 21:48:30 Related Activity: action=CREATE_TAR_WORKSPACE,processedBy=sandbox-79c44c96fd-thm5c_3051
 2020/08/12 21:48:30 Getting tar download command
 2020/08/12 21:48:33 No Vulnerabilities Found. Successfully saved all the files from the repo.
 2020/08/12 21:48:34 Successfully read the README file
 2020/08/12 21:48:35 Done with the Activity
 2020/08/12 21:48:35  --- Ready to execute the command --- 
 2020/08/12 21:48:36 workspace.template.SecFile: c65035f0-a84d-4e30-9e72-3f8587c436d1
 2020/08/12 21:48:35 -----  New Action  -----
 2020/08/12 21:48:35 Request: requestID=c0568773-22cf-4b00-b9f3-164d4176c2e3
 2020/08/12 21:48:36 Related Activity: action=Apply, workspaceID=cloudnative-toolkit-garage-dev-tools-01-500c44df-54c6-4e, processedByOrchestrator=ID:"c0568773-22cf-4b00-b9f3-164d4176c2e3_f5e1f900d526a06ebb7da933a0eb63b6" , processedByJob=job-12-543-fd66fc564-xv9gk
 
 2020/08/12 21:48:41 -----  Terraform INIT  -----
 2020/08/12 21:48:41 Starting command: terraform init -input=false -no-color
 2020/08/12 21:48:41 Terraform init | Initializing modules...
 2020/08/12 21:48:41 Terraform init | Downloading github.com/ibm-garage-cloud/terraform-ibm-container-platform.git?ref=v1.15.1 for dev_cluster...
 2020/08/12 21:48:42 Terraform init | - dev_cluster in .terraform/modules/dev_cluster
 2020/08/12 21:48:42 Terraform init | Downloading github.com/ibm-garage-cloud/terraform-ibm-logdna.git?ref=v2.2.0 for dev_infrastructure_logdna...
 2020/08/12 21:48:45 Terraform init | - dev_infrastructure_logdna in .terraform/modules/dev_infrastructure_logdna
 2020/08/12 21:48:45 Terraform init | Downloading github.com/ibm-garage-cloud/terraform-ibm-sysdig.git?ref=v2.2.0 for dev_infrastructure_sysdig...
 2020/08/12 21:48:46 Terraform init | - dev_infrastructure_sysdig in .terraform/modules/dev_infrastructure_sysdig
 2020/08/12 21:48:46 Terraform init | Downloading github.com/ibm-garage-cloud/terraform-k8s-olm.git?ref=v1.2.2 for dev_software_olm...
 2020/08/12 21:48:46 Terraform init | - dev_software_olm in .terraform/modules/dev_software_olm
 2020/08/12 21:48:46 Terraform init | Downloading github.com/ibm-garage-cloud/terraform-k8s-namespace.git?ref=v2.3.2 for dev_sre_namespace...
 2020/08/12 21:48:47 Terraform init | - dev_sre_namespace in .terraform/modules/dev_sre_namespace
 2020/08/12 21:48:47 Terraform init | Downloading github.com/ibm-garage-cloud/terraform-tools-argocd.git?ref=v2.8.0 for dev_tools_argocd...
 2020/08/12 21:48:48 Terraform init | - dev_tools_argocd in .terraform/modules/dev_tools_argocd
 2020/08/12 21:48:48 Terraform init | Downloading github.com/ibm-garage-cloud/terraform-tools-artifactory.git?ref=v1.9.0 for dev_tools_artifactory...
 2020/08/12 21:48:48 Terraform init | - dev_tools_artifactory in .terraform/modules/dev_tools_artifactory
 2020/08/12 21:48:48 Terraform init | Downloading github.com/ibm-garage-cloud/terraform-tools-dashboard.git?ref=v1.7.1 for dev_tools_dashboard...
 2020/08/12 21:48:49 Terraform init | - dev_tools_dashboard in .terraform/modules/dev_tools_dashboard
 2020/08/12 21:48:49 Terraform init | Downloading github.com/ibm-garage-cloud/terraform-tools-jenkins.git?ref=v1.4.2 for dev_tools_jenkins...
 2020/08/12 21:48:50 Terraform init | - dev_tools_jenkins in .terraform/modules/dev_tools_jenkins
 2020/08/12 21:48:50 Terraform init | Downloading github.com/ibm-garage-cloud/terraform-k8s-namespace.git?ref=v2.4.0 for dev_tools_namespace...
 2020/08/12 21:48:50 Terraform init | - dev_tools_namespace in .terraform/modules/dev_tools_namespace
 2020/08/12 21:48:50 Terraform init | Downloading github.com/ibm-garage-cloud/terraform-tools-pactbroker.git?ref=v1.3.0 for dev_tools_pactbroker_release...
 2020/08/12 21:48:51 Terraform init | - dev_tools_pactbroker_release in .terraform/modules/dev_tools_pactbroker_release
 2020/08/12 21:48:51 Terraform init | Downloading github.com/ibm-garage-cloud/terraform-tools-sonarqube.git?ref=v1.8.0 for dev_tools_sonarqube...
 2020/08/12 21:48:52 Terraform init | - dev_tools_sonarqube in .terraform/modules/dev_tools_sonarqube
 2020/08/12 21:48:52 Terraform init | Downloading github.com/ibm-garage-cloud/terraform-tools-swaggereditor.git?ref=v1.3.1 for dev_tools_swagger...
 2020/08/12 21:48:52 Terraform init | - dev_tools_swagger in .terraform/modules/dev_tools_swagger
 2020/08/12 21:48:52 Terraform init | Downloading github.com/ibm-garage-cloud/terraform-tools-tekton.git?ref=v2.0.1 for dev_tools_tekton_release...
 2020/08/12 21:48:53 Terraform init | - dev_tools_tekton_release in .terraform/modules/dev_tools_tekton_release
 2020/08/12 21:48:53 Terraform init | Downloading github.com/ibm-garage-cloud/terraform-tools-tekton-resources.git?ref=v2.0.0 for dev_tools_tekton_resources...
 2020/08/12 21:48:53 Terraform init | - dev_tools_tekton_resources in .terraform/modules/dev_tools_tekton_resources
 2020/08/12 21:48:53 Terraform init | Downloading github.com/ibm-garage-cloud/terraform-ibm-activity-tracker.git?ref=v1.1.1 for sre_activity-tracker...
 2020/08/12 21:48:54 Terraform init | - sre_activity-tracker in .terraform/modules/sre_activity-tracker
 2020/08/12 21:48:54 Terraform init | 
 2020/08/12 21:48:54 Terraform init | Initializing the backend...
 2020/08/12 21:48:54 Terraform init | 
 2020/08/12 21:48:54 Terraform init | Initializing provider plugins...
 2020/08/12 21:48:54 Terraform init | - Checking for available provider plugins...
 2020/08/12 21:48:55 Terraform init | - Downloading plugin for provider "helm" (hashicorp/helm) 1.2.4...
 2020/08/12 21:48:58 Terraform init | 
 2020/08/12 21:48:58 Terraform init | The following providers do not have any version constraints in configuration,
 2020/08/12 21:48:58 Terraform init | so the latest version was installed.
 2020/08/12 21:48:58 Terraform init | 
 2020/08/12 21:48:58 Terraform init | To prevent automatic upgrades to new major versions that may contain breaking
 2020/08/12 21:48:58 Terraform init | changes, it is recommended to add version = "..." constraints to the
 2020/08/12 21:48:58 Terraform init | corresponding provider blocks in configuration, with the constraint strings
 2020/08/12 21:48:58 Terraform init | suggested below.
 2020/08/12 21:48:58 Terraform init | 
 2020/08/12 21:48:58 Terraform init | * provider.local: version = "~> 1.4"
 2020/08/12 21:48:58 Terraform init | * provider.null: version = "~> 2.1"
 2020/08/12 21:48:58 Terraform init | 
 2020/08/12 21:48:58 Terraform init | Terraform has been successfully initialized!
 2020/08/12 21:48:58 Command finished successfully.
 
 2020/08/12 21:48:58 -----  Terraform SHOW  -----
 2020/08/12 21:48:58 Starting command: terraform show -no-color
 2020/08/12 21:49:03 Terraform show | No state.
 2020/08/12 21:49:03 Command finished successfully.
 
 2020/08/12 21:49:03 -----  Terraform APPLY  -----
 2020/08/12 21:49:03 Starting command: terraform apply -state=terraform.tfstate -var-file=schematics.tfvars -auto-approve -no-color -parallelism=10
 2020/08/12 21:49:29 Terraform apply | 
 2020/08/12 21:49:29 Terraform apply | Warning: "function_namespace": [DEPRECATED] This field will be deprecated soon
 2020/08/12 21:49:29 Terraform apply | 
 2020/08/12 21:49:29 Terraform apply | 
 2020/08/12 21:49:29 Terraform apply | 
 2020/08/12 21:49:29 Terraform apply | Warning: "function_namespace": [DEPRECATED] This field will be deprecated soon
 2020/08/12 21:49:29 Terraform apply | 
 2020/08/12 21:49:29 Terraform apply | 
 2020/08/12 21:49:29 Terraform apply | 
 2020/08/12 21:49:29 Terraform apply | Warning: "function_namespace": [DEPRECATED] This field will be deprecated soon
 2020/08/12 21:49:29 Terraform apply | 
 2020/08/12 21:49:29 Terraform apply | 
 2020/08/12 21:49:29 Terraform apply | 
 2020/08/12 21:49:29 Terraform apply | 
 2020/08/12 21:49:29 Terraform apply | Warning: "function_namespace": [DEPRECATED] This field will be deprecated soon
 2020/08/12 21:49:29 Terraform apply | 
 2020/08/12 21:49:29 Terraform apply | 
 2020/08/12 21:49:29 Terraform apply | Error: Reference to undeclared input variable
 2020/08/12 21:49:29 Terraform apply | 
 2020/08/12 21:49:29 Terraform apply |   on stage3-activity-tracker.tf line 7, in module "sre_activity-tracker":
 2020/08/12 21:49:29 Terraform apply |    7:   provision                = var.provision_activity_tracker == "true"
 2020/08/12 21:49:29 Terraform apply | 
 2020/08/12 21:49:29 Terraform apply | An input variable with the name "provision_activity_tracker" has not been
 2020/08/12 21:49:29 Terraform apply | declared. This variable can be declared with a variable
 2020/08/12 21:49:29 Terraform apply | "provision_activity_tracker" {} block.
 2020/08/12 21:49:29 Terraform apply | 
 2020/08/12 21:49:29 Terraform APPLY error: Terraform APPLY errorexit status 1
 2020/08/12 21:49:29 Could not execute action

Additional context
I checked the source code in this repo and indeed there is no variable named provision_activity_tracker in the variables.tf. On a side note, I would have set this field to false, as I already have an instance of Activity Tracker provisioned in my account in the same region, so it would have failed anyway.
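A declaration like the following would satisfy the reference; the type, description, and default here are assumptions based only on the error message, not the repository's actual fix:

```hcl
# Hypothetical addition to variables.tf. The error message only tells us
# the variable name, so everything else here is illustrative.
variable "provision_activity_tracker" {
  type        = string
  description = "Provision an Activity Tracker instance when set to \"true\""
  default     = "false"
}
```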

"/bin/sh: oc: not found" error (repo pulled 8am CT 01 Apr '20)

Installing onto OCP 3.11.

I "git pull"ed the repo this morning. Once the terraform finishes initializing, I soon see “Output: /bin/sh: oc: not found” on an oc login. See console log below. My ./launch.sh runs image 0.1.9-lite. I saw some remarks in a commit from Sean on 29 March about fixing an oc path issue. I tried pulling that commit, but hit the same issue. I backed up all the way to Bobby’s 01009be “Fixed cluster create (#39)” from 3/26; that works (and runs image version 0.1.6).

Snippet from the console log. Terraform has just finished creating the registry namespace.

module.dev_cluster.null_resource.create_registry_namespace: Creation complete after 12s [id=8668850531748498421]
module.dev_cluster.data.local_file.registry_url: Refreshing state...
module.dev_cluster.null_resource.oc_login[0]: Creating...
module.dev_cluster.null_resource.oc_login[0]: Provisioning with 'local-exec'...
module.dev_cluster.null_resource.oc_login[0] (local-exec): Executing: ["/bin/sh" "-c" "oc login -u apikey -p obfuscated --server=https://c104-e.us-east.containers.cloud.ibm.com:31418 > /dev/null"]
module.dev_cluster.null_resource.oc_login[0] (local-exec): /bin/sh: oc: not found

module.dev_cluster.null_resource.get_cluster_version: Creation complete after 3s [id=1530229671848175068]
module.dev_cluster.data.local_file.cluster_version: Refreshing state...

Error: Error running command 'oc login -u apikey -p obfuscated --server=https://c104-e.us-east.containers.cloud.ibm.com:31418 > /dev/null': exit status 127. Output: /bin/sh: oc: not found

And then I investigated:

bash-5.0$ which oc
/usr/local/bin/oc
bash-5.0$ oc login -u apikey -p --server=https://c104-e.us-east.containers.cloud.ibm.com:31418
Login successful.

Jenkins is failing to deploy in eu-gb region for a new cluster

Using the eu-gb region and an empty resource group, I have followed the instructions to deploy new tools and create a new cluster. However, Jenkins will not successfully deploy. The following error occurs in the jenkins-deploy pod:

--> Scaling jenkins-1 to 1
Warning: acceptAvailablePods encountered watch.ErrWatchClosed, retryingWarning: acceptAvailablePods encountered watch.ErrWatchClosed, retryingerror: update acceptor rejected jenkins-1: acceptAvailablePods encountered unknown error for ReplicationController tools/jenkins-1: acceptAvailablePods failed watching for ReplicationController tools/jenkins-1: : too old resource version: 6385 (8321)

All other components are created, except for sonarqube which is in a pending status.

Automatic updates of modules broken

(Issue filed with the unfilled bug report template; see the template reproduced under "Move quick install script from dev guide here" above.)

Fail to delete the "test" namespace when installing on an existing OCP cluster

When performing the installation of iteration zero on an existing OCP cluster, the process fails when attempting to delete the test namespace that is not present on the cluster.

module.dev_cluster_namespaces.null_resource.delete_namespaces[2] (local-exec): Executing: ["/bin/sh" "-c" "/home/devops/src/workspace/.terraform/modules/17ebe3ac8e95e4efdb3af7dad15b5b43/generic/cluster/namespaces/scripts/deleteNamespace.sh test"]
module.dev_cluster_namespaces.null_resource.delete_namespaces[0]: Creation complete after 0s (ID: 5564851789359941061)
module.dev_cluster_namespaces.null_resource.delete_namespaces[1]: Creation complete after 0s (ID: 8321762577535941792)
module.dev_cluster_namespaces.null_resource.delete_namespaces[2] (local-exec): *** Deleting namespace and contained resources: test
module.dev_cluster_namespaces.null_resource.delete_namespaces[3]: Creation complete after 0s (ID: 2680832943869001292)
module.dev_cluster_namespaces.null_resource.delete_namespaces[2] (local-exec): Error from server (NotFound): namespaces "test" not found

Error: Error applying plan:

1 error occurred:
        * module.dev_cluster_namespaces.null_resource.delete_namespaces[2]: Error running command '/home/devops/src/workspace/.terraform/modules/17ebe3ac8e95e4efdb3af7dad15b5b43/generic/cluster/namespaces/scripts/deleteNamespace.sh test': exit status 1. Output: *** Deleting namespace and contained resources: test
Error from server (NotFound): namespaces "test" not found

I managed to work around the issue by creating a dummy test project on the cluster and re-running the runTerraform.sh script.

Any idea what I did wrong?
Thanks
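A possible module-side fix is to make deleteNamespace.sh tolerant of a missing namespace. A minimal sketch, assuming kubectl is on the PATH (this is a hypothetical helper, not the module's actual script):

```shell
#!/bin/sh
# Hypothetical idempotent variant of deleteNamespace.sh: skip the delete
# when the namespace does not exist instead of failing the terraform apply.
delete_namespace() {
  ns="$1"
  if kubectl get namespace "$ns" >/dev/null 2>&1; then
    echo "*** Deleting namespace and contained resources: $ns"
    kubectl delete namespace "$ns"
  else
    echo "*** Namespace $ns not found, nothing to delete"
  fi
}

delete_namespace "${1:-test}"
```

kubectl's built-in `--ignore-not-found` flag on `kubectl delete namespace` achieves the same result in one line.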

Jaeger gives the error below while running the ./runTerraform.sh script

Error: Error running command '.terraform/modules/dev_tools_jaeger/scripts/deploy-instance.sh ocp4 tools appdev-cloudnativ-734219-3b1fc50af0b2002f0241bdf5d2432efd-0000.us-east.containers.appdomain.cloud jaeger ': exit status 1. Output: jaeger.jaegertracing.io/jaeger created
Waiting for deployment "jaeger" rollout to finish: 0 of 1 updated replicas are available...
W0619 05:43:35.789164 14882 reflector.go:302] k8s.io/client-go/tools/watch/informerwatcher.go:146: watch of *unstructured.Unstructured ended with: too old resource version: 229152 (229608)
Waiting for deployment "jaeger" rollout to finish: 0 of 1 updated replicas are available...
error: deployment "jaeger" exceeded its progress deadline

latest changes in sonarqube (v1.6.0) break pipelines

The latest version of the SonarQube module breaks all pipelines.

Error while running pipelines:
error is: ERROR: Error during SonarQube Scanner execution
java.lang.IllegalStateException: Unable to load component class org.sonar.scanner.scan.ProjectLock
at org.sonar.core.platform.ComponentContainer$ExtendedDefaultPicoContainer.getComponent(ComponentContainer.java:65)
at org.picocontainer.DefaultPicoContainer.getComponent(DefaultPicoContainer.java:678)
at org.sonar.core.platform.ComponentContainer.getComponentByType(ComponentContainer.java:281)
at org.sonar.s

Toolkit deployment fails

Describe the bug
Toolkit deployment fails with the below error (retrieved from workspace logs)

Error: Error running command '.terraform/modules/dev_sre_namespace/scripts/setup-namespace-pull-secrets.sh ibm-observe': exit status 1. Output: *** Copying pull secrets from default namespace to ibm-observe namespace
 2020/10/14 17:28:13 Terraform apply | secret/all-icr-io created
 2020/10/14 17:28:13 Terraform apply | Error from server (NotFound): serviceaccounts "default" not found
 2020/10/14 17:28:13 Terraform apply | *** Adding secrets to serviceaccount/default in ibm-observe namespace
 2020/10/14 17:28:13 Terraform apply | Error from server (NotFound): serviceaccounts "default" not found
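The `serviceaccounts "default" not found` error suggests setup-namespace-pull-secrets.sh ran before Kubernetes had finished creating the namespace's default serviceaccount. One possible mitigation is to poll for it first; a sketch (hypothetical helper, assuming kubectl access to the cluster):

```shell
#!/bin/sh
# Hypothetical helper: poll until the automatically created "default"
# serviceaccount exists in the namespace before patching secrets onto it.
wait_for_default_sa() {
  ns="$1"
  retries="${2:-30}"
  while [ "$retries" -gt 0 ]; do
    if kubectl get serviceaccount default -n "$ns" >/dev/null 2>&1; then
      return 0
    fi
    retries=$((retries - 1))
    sleep 2
  done
  echo "serviceaccount/default never appeared in namespace $ns" >&2
  return 1
}
```

The pull-secrets script could call `wait_for_default_sa ibm-observe` before attempting to modify serviceaccount/default.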

Version
v2.1.0

Environment configuration
Tried provisioning on an existing ROKS 4.5 cluster

Desktop (please complete the following information):

  • OS: macOS
  • Browser Safari


argocd installed as part of latest toolkit cannot be used

The argocd instance installed by the operator in the latest toolkit cannot be used. argocd runs and the console can be accessed, but you cannot create apps, repositories, etc.

Fix:
Edit the argocd instance in tools namespace and make the default policy admin.

oc edit argocd argocd -n tools

change

rbac:
  defaultPolicy: role:readonly

to

rbac:
  defaultPolicy: role:admin

to fix the problem.
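The same fix can be applied non-interactively; a sketch, assuming the operator reads the policy from spec.rbac.defaultPolicy in the ArgoCD custom resource:

```shell
#!/bin/sh
# Hypothetical one-shot equivalent of the "oc edit" above: merge-patch the
# ArgoCD custom resource so the default RBAC policy grants admin.
patch_argocd_rbac() {
  oc patch argocd argocd -n "$1" --type merge \
    -p '{"spec":{"rbac":{"defaultPolicy":"role:admin"}}}'
}

# patch_argocd_rbac tools
```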

installing on existing ocp4 failed with cluster_type not found

Hey guys, tip/bug if you're installing to an existing ocp4 cluster.

With cluster_type set to ocp4, I kept getting this error:

Error: Missing required argument

  on stage2-tekton.tf line 10, in module "dev_tools_tekton_resources":
  10: module "dev_tools_tekton_resources" {

The argument "cluster_type" is required, but no definition was found.

Looked at some of the other modules and changed, in the stages-ocp4/stage2-tekton.tf file,

cluster_type = module.dev_cluster.type

to

cluster_type = var.cluster_type
It seems to be running now. I can't figure out why the original version isn't getting picked up, but I don't have much experience with Terraform.

Error Could not login to openshift account on OCP4

Using branch ocp4

After a few failures I get stuck on this error

module.dev_tools_argocd_release.null_resource.argocd_release: Creation complete after 1m11s [id=8500823640980418991]

Error: Error downloading the cluster config [csantana-cluster]: Could not login to openshift account

  on .terraform/modules/dev_cluster/cloud-managed/cluster/ibmcloud/main.tf line 90, in data "ibm_container_cluster_config" "cluster":
  90: data "ibm_container_cluster_config" "cluster" {



Error: malformed CRN: Error parsing JSON

  on .terraform/modules/dev_infrastructure_postgres/cloud-managed/services/postgres/main.tf line 48, in data "ibm_resource_key" "postgresql":
  48: data "ibm_resource_key" "postgresql" {

Handle resource group names with issues like uppercase letters

The resource group name for a Dev Tools environment is used as the prefix for naming Kubernetes resources, so the resource group name has to follow the restrictions for Kubernetes resource names. Either the script should warn the user at the beginning that the resource group name won't work, or better yet, when the script sets the prefix based on the resource group name, it should fix problems in the name such as converting capital letters to lowercase, converting underscores (and spaces?) to dashes, etc. That way the prefix is always a valid Kubernetes resource name even if the resource group name is not.
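The normalization described above could be done where the script derives the prefix; a minimal sketch, assuming only lowercase letters, digits, and dashes should survive (hypothetical helper, not existing toolkit code):

```shell
#!/bin/sh
# Hypothetical helper: turn a resource group name into a valid Kubernetes
# resource name prefix (lowercase, underscores and spaces become dashes,
# any other invalid characters are dropped).
sanitize_prefix() {
  echo "$1" | tr '[:upper:]' '[:lower:]' | tr '_ ' '--' | tr -cd 'a-z0-9\n-'
}

sanitize_prefix "My_Resource Group"
```

With this in place the prefix is always a valid Kubernetes resource name even when the resource group name is not.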

machine flavor not being selected in vpc cluster tile

Describe the bug
Error while creating a new VPC cluster on IBM Cloud.
I didn't select a machine flavor, assuming one would be auto-selected.

Error

 2020/12/15 20:03:55 Terraform apply | Error: Request failed with status code: 400, ServerErrorResponse: {"incidentID":"6022c54923742713-DEN","code":"E4e5b","description":"Could not find flavor in the requested zone.","type":"BadRequest","recoveryCLI":"To list available flavors, run 'ibmcloud ks flavors --zone \u003czone\u003e'."}
 2020/12/15 20:03:55 Terraform apply | 
 2020/12/15 20:03:55 Terraform apply |   on .terraform/modules/dev_cluster/main-1-vpc.tf line 72, in resource "ibm_container_vpc_cluster" "cluster":
 2020/12/15 20:03:55 Terraform apply |   72: resource "ibm_container_vpc_cluster" "cluster" {
 2020/12/15 20:03:55 Terraform apply | 
 2020/12/15 20:03:55 Terraform apply | 
 2020/12/15 20:03:55 Terraform APPLY error: Terraform APPLY errorexit status 1
 2020/12/15 20:03:55 Could not execute action
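As the recoveryCLI hint in the error suggests, the flavors available in the target zone can be listed before setting one explicitly. A small wrapper for illustration (the zone name is an example value):

```shell
#!/bin/sh
# Per the recoveryCLI hint above: list worker-node flavors available in the
# target zone so a valid one can be chosen explicitly.
list_flavors() {
  ibmcloud ks flavors --zone "$1"
}

# list_flavors us-south-1
```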

Flavor is not provisioned properly

Describe the bug
When provisioning from the 221 Cloud-Native Classic Cluster tile

To Reproduce
Screenshots are provided to show the inputs to the catalog tile

Expected behavior
The flavor was set to b3c.16x64, but the default flavor of b3c.4x16 was used to provision the OpenShift cluster.

IBM Cloud
Select the services and tools affected

  • IBM Kubernetes Service Managed Service
  • [X] Red Hat OpenShift Managed Services
  • Jenkins
  • Tekton
  • SonarQube
  • Pact
  • Artifactory
  • IBM Image Registry

Desktop (please complete the following information):

  • OS: macOS Big Sur 11.0.1
  • Browser: Chrome
  • Version: 87.0.4280.67

Additional context
Screen Shot 2020-11-29 at 3.07.28 PM.png
Screen Shot 2020-11-29 at 3.07.44 PM.png
Screen Shot 2020-11-29 at 3.07.54 PM.png
Screen Shot 2020-11-29 at 5.34.12 PM.png
Screen Shot 2020-11-29 at 5.33.53 PM.png
Screen Shot 2020-11-29 at 5.26.40 PM.png

The log file
2020/11/29 20:08:04 ----- New Workspace Action -----
2020/11/29 20:08:04 Request: activitId=1a3b4eaff546729622f7e9808d67501d, account=91d0377daf0d88b58cf800e3b10d8369, owner=[email protected], requestID=1d472e79-48ee-48a2-8ad7-9c8b5be25193
2020/11/29 20:08:05 Related Activity: action=WORKSPACE_CREATE, workspaceID=cluster-classic-11-29-2020-f2423063-c4db-41, processedBy=orchestrator-6fd9b58b8d-z584s
2020/11/29 20:08:05 Related Workspace: name=cluster-classic-11-29-2020, sourcerelease=(not specified), sourceurl=https://github.com/ibm-garage-cloud/ibm-garage-iteration-zero/releases/download/v2.4.0/cluster-classic.tar.gz
2020/11/29 20:08:08 --- Ready to execute the command ---
2020/11/29 20:08:09 ----- New Action -----
2020/11/29 20:08:09 Request: RepoURL=https://github.com/ibm-garage-cloud/ibm-garage-iteration-zero/releases/download/v2.4.0/cluster-classic.tar.gz, WorkspaceSource=Schematics, Branch=, Release=, Folder=cluster-classic
2020/11/29 20:08:09 Related Activity: action=CREATE_TAR_WORKSPACE,processedBy=sandbox-5f46cdd4d9-72z59_3865
2020/11/29 20:08:09 Getting tar download command
2020/11/29 20:08:11 No Vulnerabilities Found. Successfully saved all the files from the repo.
2020/11/29 20:08:13 Successfully read the README file
2020/11/29 20:08:14 Done with the Activity
2020/11/29 20:08:14 --- Ready to execute the command ---
2020/11/29 20:08:15 workspace.template.SecFile: d158127d-97c4-4347-8e68-409be8c10839
2020/11/29 20:08:14 ----- New Action -----
2020/11/29 20:08:14 Request: requestID=1d472e79-48ee-48a2-8ad7-9c8b5be25193
2020/11/29 20:08:16 Related Activity: action=Apply, workspaceID=cluster-classic-11-29-2020-f2423063-c4db-41, processedByOrchestrator=1d472e79-48ee-48a2-8ad7-9c8b5be25193_1a3b4eaff546729622f7e9808d67501d, processedByJob=job12-6cd7ffdbd9-x5vb9

2020/11/29 20:08:21 ----- Terraform INIT -----
2020/11/29 20:08:21 Starting command: terraform init -input=false -lock=false -no-color
2020/11/29 20:08:21 Terraform init | Initializing modules...
2020/11/29 20:08:21 Terraform init | Downloading github.com/ibm-garage-cloud/terraform-ibm-container-platform.git?ref=v1.18.0 for dev_cluster...
2020/11/29 20:08:21 Terraform init | - dev_cluster in .terraform/modules/dev_cluster
2020/11/29 20:08:22 Terraform init |
2020/11/29 20:08:22 Terraform init | Initializing the backend...
2020/11/29 20:08:22 Terraform init |
2020/11/29 20:08:22 Terraform init | Initializing provider plugins...
2020/11/29 20:08:22 Terraform init | - Checking for available provider plugins...
2020/11/29 20:08:22 Terraform init | - Downloading plugin for provider "helm" (hashicorp/helm) 1.3.2...
2020/11/29 20:08:25 Terraform init |
2020/11/29 20:08:25 Terraform init | The following providers do not have any version constraints in configuration,
2020/11/29 20:08:25 Terraform init | so the latest version was installed.
2020/11/29 20:08:25 Terraform init |
2020/11/29 20:08:25 Terraform init | To prevent automatic upgrades to new major versions that may contain breaking
2020/11/29 20:08:25 Terraform init | changes, it is recommended to add version = "..." constraints to the
2020/11/29 20:08:25 Terraform init | corresponding provider blocks in configuration, with the constraint strings
2020/11/29 20:08:25 Terraform init | suggested below.
2020/11/29 20:08:25 Terraform init |
2020/11/29 20:08:25 Terraform init | * provider.local: version = "~> 2.0"
2020/11/29 20:08:25 Terraform init | * provider.null: version = "~> 3.0"
2020/11/29 20:08:25 Terraform init |
2020/11/29 20:08:25 Terraform init | Terraform has been successfully initialized!
2020/11/29 20:08:25 Command finished successfully.

2020/11/29 20:08:25 ----- Terraform APPLY -----
2020/11/29 20:08:25 Starting command: terraform apply -state=terraform.tfstate -var-file=schematics.tfvars -auto-approve -no-color -lock=false -parallelism=10
2020/11/29 20:08:37 Terraform apply | module.dev_cluster.data.ibm_resource_group.resource_group: Refreshing state...
2020/11/29 20:08:46 Terraform apply | module.dev_cluster.null_resource.ibmcloud_login: Creating...
2020/11/29 20:08:46 Terraform apply | module.dev_cluster.null_resource.ibmcloud_login: Provisioning with 'local-exec'...
2020/11/29 20:08:46 Terraform apply | module.dev_cluster.null_resource.create_dirs: Creating...
2020/11/29 20:08:46 Terraform apply | module.dev_cluster.null_resource.setup-chart: Creating...
2020/11/29 20:08:46 Terraform apply | module.dev_cluster.null_resource.setup-chart: Provisioning with 'local-exec'...
2020/11/29 20:08:46 Terraform apply | module.dev_cluster.null_resource.create_dirs: Provisioning with 'local-exec'...
2020/11/29 20:08:46 Terraform apply | module.dev_cluster.null_resource.setup-chart (local-exec): Executing: ["/bin/sh" "-c" "mkdir -p /tmp/tfws-499801141/cluster-classic/gitops/cloud-setup && cp -R .terraform/modules/dev_cluster/chart/cloud-setup/* /tmp/tfws-499801141/cluster-classic/gitops/cloud-setup"]
2020/11/29 20:08:46 Terraform apply | module.dev_cluster.null_resource.create_dirs (local-exec): Executing: ["/bin/sh" "-c" "mkdir -p /tmp/tfws-499801141/cluster-classic/.tmp"]
2020/11/29 20:08:46 Terraform apply | module.dev_cluster.null_resource.ibmcloud_login (local-exec): Executing: ["/bin/sh" "-c" "ibmcloud login -r us-south -g cloudnativetoolkit --apikey ${APIKEY} > /dev/null"]
2020/11/29 20:08:47 Terraform apply | module.dev_cluster.null_resource.create_dirs: Provisioning with 'local-exec'...
2020/11/29 20:08:47 Terraform apply | module.dev_cluster.null_resource.create_dirs (local-exec): Executing: ["/bin/sh" "-c" "mkdir -p /tmp/tfws-499801141/cluster-classic/.kube"]
2020/11/29 20:08:47 Terraform apply | module.dev_cluster.null_resource.setup-chart: Creation complete after 0s [id=7898835537000230033]
2020/11/29 20:08:47 Terraform apply | module.dev_cluster.null_resource.create_dirs: Creation complete after 0s [id=1249128077975019190]
2020/11/29 20:08:49 Terraform apply | module.dev_cluster.data.ibm_container_cluster_versions.cluster_versions: Refreshing state...
2020/11/29 20:08:49 Terraform apply | module.dev_cluster.ibm_container_cluster.cluster[0]: Creating...
2020/11/29 20:08:56 Terraform apply | module.dev_cluster.null_resource.ibmcloud_login: Still creating... [10s elapsed]
2020/11/29 20:08:59 Terraform apply | module.dev_cluster.ibm_container_cluster.cluster[0]: Still creating... [10s elapsed]
2020/11/29 20:09:06 Terraform apply | module.dev_cluster.null_resource.ibmcloud_login: Still creating... [20s elapsed]
2020/11/29 20:09:08 Terraform apply | module.dev_cluster.null_resource.ibmcloud_login: Creation complete after 22s [id=2031535835236068711]
2020/11/29 20:09:09 Terraform apply | module.dev_cluster.ibm_container_cluster.cluster[0]: Still creating... [20s elapsed]
[... "Still creating..." progress messages for module.dev_cluster.ibm_container_cluster.cluster[0], repeated every 10s from 30s to 22m50s elapsed, elided ...]
2020/11/29 20:31:49 Terraform apply | module.dev_cluster.ibm_container_cluster.cluster[0]: Still creating... [23m0s elapsed]
2020/11/29 20:31:50 Terraform apply | module.dev_cluster.ibm_container_cluster.cluster[0]: Creation complete after 23m1s [id=bv1vvmgd0lg4nntvqfc0]
2020/11/29 20:31:50 Terraform apply | module.dev_cluster.data.ibm_container_cluster_config.cluster: Refreshing state...
2020/11/29 20:31:50 Terraform apply | module.dev_cluster.data.ibm_container_cluster.config[0]: Refreshing state...
2020/11/29 20:32:02 Terraform apply | module.dev_cluster.null_resource.setup_kube_config: Creating...
2020/11/29 20:32:02 Terraform apply | module.dev_cluster.null_resource.setup_kube_config: Provisioning with 'local-exec'...
2020/11/29 20:32:02 Terraform apply | module.dev_cluster.null_resource.setup_kube_config (local-exec): Executing: ["/bin/sh" "-c" "rm -f /tmp/tfws-499801141/cluster-classic/.kube/config && ln -s /tmp/tfws-499801141/cluster-classic/.kube/6688f53ac7ad66c2640c2c65b925f752fea5f71a_cnt-cp4int-2_k8sconfig/config.yml /tmp/tfws-499801141/cluster-classic/.kube/config"]
2020/11/29 20:32:02 Terraform apply | module.dev_cluster.null_resource.setup_kube_config: Provisioning with 'local-exec'...
2020/11/29 20:32:02 Terraform apply | module.dev_cluster.null_resource.setup_kube_config (local-exec): Executing: ["/bin/sh" "-c" "cp /tmp/tfws-499801141/cluster-classic/.kube/6688f53ac7ad66c2640c2c65b925f752fea5f71a_cnt-cp4int-2_k8sconfig/* /tmp/tfws-499801141/cluster-classic/.kube"]
2020/11/29 20:32:02 Terraform apply | module.dev_cluster.null_resource.setup_kube_config: Creation complete after 0s [id=3362779350908468821]
2020/11/29 20:32:03 Terraform apply | module.dev_cluster.null_resource.delete-helm-cloud-config: Creating...
2020/11/29 20:32:03 Terraform apply | module.dev_cluster.null_resource.delete-consolelink[0]: Creating...
2020/11/29 20:32:03 Terraform apply | module.dev_cluster.null_resource.delete-consolelink[0]: Provisioning with 'local-exec'...
2020/11/29 20:32:03 Terraform apply | module.dev_cluster.null_resource.delete-helm-cloud-config: Provisioning with 'local-exec'...
2020/11/29 20:32:03 Terraform apply | module.dev_cluster.null_resource.delete-consolelink[0] (local-exec): Executing: ["/bin/sh" "-c" "kubectl delete consolelink toolkit-github --ignore-not-found"]
2020/11/29 20:32:03 Terraform apply | module.dev_cluster.null_resource.delete-helm-cloud-config (local-exec): Executing: ["/bin/sh" "-c" "kubectl delete secret -n default -l name=ibmcloud-config --ignore-not-found"]
2020/11/29 20:32:05 Terraform apply | module.dev_cluster.null_resource.delete-consolelink[0]: Provisioning with 'local-exec'...
2020/11/29 20:32:05 Terraform apply | module.dev_cluster.null_resource.delete-consolelink[0] (local-exec): Executing: ["/bin/sh" "-c" "kubectl delete consolelink toolkit-registry --ignore-not-found"]
2020/11/29 20:32:05 Terraform apply | module.dev_cluster.null_resource.delete-helm-cloud-config (local-exec): No resources found
2020/11/29 20:32:05 Terraform apply | module.dev_cluster.null_resource.delete-helm-cloud-config: Provisioning with 'local-exec'...
2020/11/29 20:32:05 Terraform apply | module.dev_cluster.null_resource.delete-helm-cloud-config (local-exec): Executing: ["/bin/sh" "-c" "kubectl delete secret -n default -l name=cloud-setup --ignore-not-found"]
2020/11/29 20:32:06 Terraform apply | module.dev_cluster.null_resource.delete-consolelink[0]: Creation complete after 3s [id=4501311566761796380]
2020/11/29 20:32:06 Terraform apply | module.dev_cluster.null_resource.delete-helm-cloud-config (local-exec): No resources found
2020/11/29 20:32:06 Terraform apply | module.dev_cluster.null_resource.delete-helm-cloud-config: Provisioning with 'local-exec'...
2020/11/29 20:32:06 Terraform apply | module.dev_cluster.null_resource.delete-helm-cloud-config (local-exec): Executing: ["/bin/sh" "-c" "kubectl delete secret -n default ibmcloud-apikey --ignore-not-found"]
2020/11/29 20:32:06 Terraform apply | module.dev_cluster.null_resource.delete-helm-cloud-config: Provisioning with 'local-exec'...
2020/11/29 20:32:06 Terraform apply | module.dev_cluster.null_resource.delete-helm-cloud-config (local-exec): Executing: ["/bin/sh" "-c" "kubectl delete configmap -n default ibmcloud-config --ignore-not-found"]
2020/11/29 20:32:07 Terraform apply | module.dev_cluster.null_resource.delete-helm-cloud-config: Provisioning with 'local-exec'...
2020/11/29 20:32:07 Terraform apply | module.dev_cluster.null_resource.delete-helm-cloud-config (local-exec): Executing: ["/bin/sh" "-c" "kubectl delete secret -n default cloud-access --ignore-not-found"]
2020/11/29 20:32:07 Terraform apply | module.dev_cluster.null_resource.delete-helm-cloud-config: Provisioning with 'local-exec'...
2020/11/29 20:32:07 Terraform apply | module.dev_cluster.null_resource.delete-helm-cloud-config (local-exec): Executing: ["/bin/sh" "-c" "kubectl delete configmap -n default cloud-config --ignore-not-found"]
2020/11/29 20:32:08 Terraform apply | module.dev_cluster.null_resource.delete-helm-cloud-config: Creation complete after 5s [id=2256784502113175597]
2020/11/29 20:32:19 Terraform apply | module.dev_cluster.local_file.cloud-values: Creating...
2020/11/29 20:32:19 Terraform apply | module.dev_cluster.local_file.cloud-values: Creation complete after 1s [id=bd6b1b05f2d0ea5021d313115ba4880e1a909e73]
2020/11/29 20:32:19 Terraform apply | module.dev_cluster.null_resource.print-values: Creating...
2020/11/29 20:32:19 Terraform apply | module.dev_cluster.null_resource.print-values: Provisioning with 'local-exec'...
2020/11/29 20:32:19 Terraform apply | module.dev_cluster.null_resource.print-values (local-exec): Executing: ["/bin/sh" "-c" "cat /tmp/tfws-499801141/cluster-classic/gitops/cloud-setup/values.yaml"]
2020/11/29 20:32:19 Terraform apply | module.dev_cluster.helm_release.cloud_setup: Creating...
2020/11/29 20:32:19 Terraform apply | module.dev_cluster.null_resource.print-values (local-exec): "cloud-setup":
2020/11/29 20:32:19 Terraform apply | module.dev_cluster.null_resource.print-values (local-exec): "cntk-dev-guide":
2020/11/29 20:32:19 Terraform apply | module.dev_cluster.null_resource.print-values (local-exec): "displayName": "Cloud-Native Toolkit"
2020/11/29 20:32:19 Terraform apply | module.dev_cluster.null_resource.print-values (local-exec): "name": "cntk-dev-guide"
2020/11/29 20:32:19 Terraform apply | module.dev_cluster.null_resource.print-values (local-exec): "url": "https://cloudnativetoolkit.dev"
2020/11/29 20:32:19 Terraform apply | module.dev_cluster.null_resource.print-values (local-exec): "first-app":
2020/11/29 20:32:19 Terraform apply | module.dev_cluster.null_resource.print-values (local-exec): "displayName": "Deploy first app"
2020/11/29 20:32:19 Terraform apply | module.dev_cluster.null_resource.print-values (local-exec): "name": "first-app"
2020/11/29 20:32:19 Terraform apply | module.dev_cluster.null_resource.print-values (local-exec): "url": "https://cloudnativetoolkit.dev/getting-started-day-1/deploy-app/"
2020/11/29 20:32:19 Terraform apply | module.dev_cluster.null_resource.print-values (local-exec): "ibmcloud":
2020/11/29 20:32:19 Terraform apply | module.dev_cluster.null_resource.print-values (local-exec): "apikey": "xxxhiddenxxx"
2020/11/29 20:32:19 Terraform apply | module.dev_cluster.null_resource.print-values (local-exec): "cluster_name": "cnt-cp4int-2"
2020/11/29 20:32:19 Terraform apply | module.dev_cluster.null_resource.print-values (local-exec): "cluster_type": "openshift"
2020/11/29 20:32:19 Terraform apply | module.dev_cluster.null_resource.print-values (local-exec): "cluster_version": "4.4.29_openshift"
2020/11/29 20:32:19 Terraform apply | module.dev_cluster.null_resource.print-values (local-exec): "ingress_subdomain": "cnt-cp4int-2-d397fcc2ee1796ae5b779b59baaa6ea4-0000.us-south.containers.appdomain.cloud"
2020/11/29 20:32:19 Terraform apply | module.dev_cluster.null_resource.print-values (local-exec): "region": "us-south"
2020/11/29 20:32:19 Terraform apply | module.dev_cluster.null_resource.print-values (local-exec): "resource_group": "cloudnativetoolkit"
2020/11/29 20:32:19 Terraform apply | module.dev_cluster.null_resource.print-values (local-exec): "server_url": "https://c116-e.us-south.containers.cloud.ibm.com:32208"
2020/11/29 20:32:19 Terraform apply | module.dev_cluster.null_resource.print-values (local-exec): "tls_secret_name": "cnt-cp4int-2-d397fcc2ee1796ae5b779b59baaa6ea4-0000"
2020/11/29 20:32:19 Terraform apply | module.dev_cluster.null_resource.print-values (local-exec): "global":
2020/11/29 20:32:19 Terraform apply | module.dev_cluster.null_resource.print-values (local-exec): "clusterType": "ocp4"
2020/11/29 20:32:19 Terraform apply | module.dev_cluster.null_resource.print-values (local-exec): "ingressSubdomain": "cnt-cp4int-2-d397fcc2ee1796ae5b779b59baaa6ea4-0000.us-south.containers.appdomain.cloud"
2020/11/29 20:32:19 Terraform apply | module.dev_cluster.null_resource.print-values (local-exec): "tlsSecretName": "cnt-cp4int-2-d397fcc2ee1796ae5b779b59baaa6ea4-0000"
2020/11/29 20:32:19 Terraform apply | module.dev_cluster.null_resource.print-values: Creation complete after 0s [id=8259230916646854547]
2020/11/29 20:32:22 Terraform apply | module.dev_cluster.helm_release.cloud_setup: Creation complete after 3s [id=cloud-setup]
2020/11/29 20:32:22 Terraform apply |
2020/11/29 20:32:22 Terraform apply | Warning: "cluster_name_id": [DEPRECATED] use name instead
2020/11/29 20:32:22 Terraform apply |
2020/11/29 20:32:22 Terraform apply | on .terraform/modules/dev_cluster/main-1-classic.tf line 17, in data "ibm_container_cluster" "config":
2020/11/29 20:32:22 Terraform apply | 17: data "ibm_container_cluster" "config" {
2020/11/29 20:32:22 Terraform apply |
2020/11/29 20:32:22 Terraform apply | (and one more similar warning elsewhere)
2020/11/29 20:32:22 Terraform apply |
2020/11/29 20:32:22 Terraform apply |
2020/11/29 20:32:22 Terraform apply | Apply complete! Resources: 10 added, 0 changed, 0 destroyed.
2020/11/29 20:32:22 Command finished successfully.
2020/11/29 20:32:26 Done with the workspace action

Detect + avoid problems with us.icr.io registry namespace collisions

As I understand it, when installing onto the OCP service on IBM Cloud:

The installation Terraform uses the resource group name specified in settings/environment.tfvars as the name of the namespace to be created in the multi-tenant registry at us.icr.io. There are two problems with this strategy, both of which impact the consumability of the asset.

1.) Since the registry in this scenario is multi-tenant, the namespace (i.e., the resource group name) must be unique within the region. I do not see guidance in our docs on how the Admin/Dev can verify that a selected resource group name / namespace value is unique. In fact, I could not find any guidance about resource group name constraints when I searched just now.

2.) If a naive user simply uses the "default" resource group for the install, the install will break (I did this the first time through, but didn't record the error).

In both cases, the errors leave at least a few resources lying around that are some trouble to clean up. The root cause of the failed install may or may not be apparent to the user.

Can the automation be augmented so that it creates and verifies the needed unique name, without having to educate the user on this obscure point?
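As a stopgap until the automation handles this, a pre-flight check before running terraform could catch both failure modes. This is only a sketch: the exact ICR namespace rules (4-30 characters, lowercase letters, digits, `-` and `_`, starting and ending with a letter or digit) are my reading of the IBM Cloud docs and should be confirmed, and the `ibmcloud cr` commands assume a logged-in CLI with the container-registry plugin installed.

```shell
#!/bin/sh
# Pre-flight sketch: check that the resource group name chosen in
# settings/environment.tfvars will work as a us.icr.io registry namespace.

# Format check against the (assumed) ICR namespace naming rules:
# 4-30 chars, lowercase letters/digits/'-'/'_', alphanumeric at both ends.
valid_ns() {
  echo "$1" | grep -Eq '^[a-z0-9][a-z0-9_-]{2,28}[a-z0-9]$'
}

NS="${1:-cloudnativetoolkit}"   # candidate resource group / namespace name

if ! valid_ns "$NS"; then
  echo "'$NS' does not satisfy ICR namespace naming rules" >&2
  exit 1
fi

# Uniqueness check: the namespace must be free region-wide, since the
# registry is multi-tenant. namespace-add fails if another account in
# the region already owns the name, surfacing the collision before
# terraform apply rather than mid-install.
# ibmcloud cr namespace-add "$NS" || { echo "namespace '$NS' unavailable" >&2; exit 1; }
```

The actual `ibmcloud cr namespace-add` call is left commented out because it has the side effect of creating the namespace; the automation could instead create it up front and let Terraform consume it.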

Thanks.
