This project was forked from getcord/cord-preview.

License: Apache License 2.0

The Cord Monorepo

This is the code and tools needed to run the Cord service.

Running Locally

Use these instructions to run Cord locally, using only local resources (DB, Redis, etc).

Note: Cord has only routinely been run on MacOS and Linux distributions using apt. Nothing should be platform-specific, but instructions for other platforms are left as an exercise for the reader.

Getting Set Up

Dependencies

Before running Cord, you need to install some software.

  • Install Node and NPM
  • Install Docker (Mac, Linux)
  • [Mac-only] Install Homebrew
  • Install jq (Mac: brew install jq, Linux: apt install jq)
  • Postgres command line tools (Mac: brew install libpq && brew link --force libpq, Linux: apt install postgresql-client)
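After installing, you can sanity-check that everything is on your PATH (a quick sketch; `psql` here stands in for the Postgres command line tools):

```shell
# Sketch: report any of the tools above that aren't on PATH yet.
for tool in node npm docker jq psql; do
  command -v "$tool" >/dev/null 2>&1 || echo "missing: $tool"
done
```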

Local Certificates

To connect over TLS to your local machine, you need to install a self-signed certificate.

Mac:

  • Run scripts/generate-localhost-certificates.sh (which will use Homebrew to install mkcert)
  • If you're using Firefox, set security.enterprise_roots.enabled to true in about:config

Linux:

  • Install mkcert via apt install mkcert
  • Run scripts/generate-localhost-certificates.sh

Configuration

Run scripts/generate-dotenv.cjs --include-secrets=false to generate a .env file that contains configuration options for running the dev server.

Running

Run npm run local-dev to start the local development environment.

Local endpoints

Migrating from the Cord Platform

There are two steps to migrating your data from the Cord platform to your own self-hosted infrastructure: migrating the database data and migrating S3 data (such as message attachments).

In both cases, you need a project management auth token. This should be provided to the following APIs in an Authorization header.

S3 Data

To copy your files from S3, first you need to configure an S3 bucket as described in steps 1 and 2 of the documentation for configuring a custom S3 bucket. Then create a policy that allows at least the PutObject and ListObjects permissions to arn:aws:iam::869934154475:role/radical-stack-prodServerLondonASGInstanceRole31491-P9EJBVI9CBCR (our production server user) on both the bucket and every object in the bucket. The policy should look something like this:

{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Principal": {
                "AWS": "arn:aws:iam::869934154475:role/radical-stack-prodServerLondonASGInstanceRole31491-P9EJBVI9CBCR"
            },
            "Action": "s3:*",
            "Resource": [
                "arn:aws:s3:::YOUR-S3-BUCKET-ID",
                "arn:aws:s3:::YOUR-S3-BUCKET-ID/*"
            ]
        }
    ]
}
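One way to attach the policy above is with the AWS CLI. In this sketch, the policy JSON is assumed to be saved locally as `cord-bucket-policy.json` (a filename of our choosing):

```shell
# Sketch: attach a bucket policy saved as a local JSON file.
apply_cord_bucket_policy() {
  bucket="$1"   # e.g. YOUR-S3-BUCKET-ID
  policy="$2"   # path to the policy JSON document
  aws s3api put-bucket-policy --bucket "$bucket" --policy "file://$policy"
}
```

Usage: `apply_cord_bucket_policy YOUR-S3-BUCKET-ID cord-bucket-policy.json`. You can also paste the policy directly into the bucket's Permissions tab in the S3 console.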

Once that's done, contact someone at Cord, as we also need to configure an IAM policy in our account to approve that same access.

After setting up the permissions, call https://api.cord.com/v1/customer/copyfiles?region=YOUR-S3-REGION&bucket=YOUR-S3-BUCKET-ID. The handler is an incremental copy that takes a limit parameter from 1 to 1000 (default 10) and attempts to copy that many files into your bucket, so you will likely have to run it more than once. Keep running it until it returns {"copied":0}, at which point all files are copied.

You can do this step at any point, and because it's incremental, you can run it before you're ready to switch to your own infrastructure and then run it again at the point of switchover to just copy over any new files that have been uploaded since.
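The run-until-done step above can be scripted. A sketch, assuming the endpoint accepts the project management auth token directly in the `Authorization` header (whether a scheme prefix such as `Bearer` is required is an assumption to verify) and returns the JSON shown above:

```shell
# Sketch: call the copyfiles endpoint until it reports nothing left to copy.
# TOKEN, REGION, and BUCKET are placeholders you must set yourself.
copy_all_files() {
  url="https://api.cord.com/v1/customer/copyfiles?region=${REGION}&bucket=${BUCKET}&limit=1000"
  while :; do
    result=$(curl -s -H "Authorization: ${TOKEN}" "$url")
    echo "$result"
    # Stop once every file has been copied.
    [ "$result" = '{"copied":0}' ] && break
  done
}
```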

Database Migration

To migrate your database data, call https://api.cord.com/v1/customer/dbdump. This produces a SQL script containing all of your data, ready to be run against an empty database via psql; it includes the data for all of your projects. Be patient: collecting everything may take a minute or two.

The data is only valid as of the time the command is run, so you will likely want to use one dump to test your migration process, then run the command again right before switching to your own infrastructure to capture the most up-to-date data.
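The dump-and-restore round trip can be sketched as follows. `TOKEN`, the auth header format, and the `DB_URL` connection string are placeholders for your own values:

```shell
# Sketch: fetch the SQL dump, then replay it against an empty database.
# TOKEN and DB_URL are placeholders you must set for your environment.
dump_and_restore() {
  curl -s -H "Authorization: ${TOKEN}" \
    https://api.cord.com/v1/customer/dbdump > cord-dump.sql
  # ON_ERROR_STOP aborts the restore on the first failed statement.
  psql "${DB_URL}" -v ON_ERROR_STOP=1 -f cord-dump.sql
}
```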
