Workers

Cloudflare workers ⚡

Installation

Bun is used for deps, tests, and scripts; Wrangler for development and deployment.

# install bun
export PATH="${HOME}/.bun/bin:${PATH}"
curl -fsSL https://bun.sh/install | bash

# install dependencies
bun i

# configure environment (only if you want to deploy)
export CLOUDFLARE_ACCOUNT_ID=...
export CLOUDFLARE_API_TOKEN=...

Usage

Environment variables

Create a .dev.vars file in the worker's folder; its variables are set as secrets on the env object during development. For example, the huggingface worker expects this in huggingface/.dev.vars:

HF_TOKEN=hf_...

See types.ts for more. Use wrangler secret {list,put,delete} --name=... to manage secrets for a deployed worker.
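
As a rough sketch of how a secret surfaces at runtime (the Env shape below is illustrative; the actual bindings are declared in types.ts):

// illustrative only: a .dev.vars entry (or a deployed secret) shows up as a
// property on the env argument passed to the worker's fetch handler
interface Env {
  HF_TOKEN: string
}

export default {
  async fetch(request: Request, env: Env): Promise<Response> {
    // locally env.HF_TOKEN comes from .dev.vars; in production it comes from
    // `wrangler secret put HF_TOKEN --name=...`
    return new Response(env.HF_TOKEN ? 'token configured' : 'token missing')
  }
}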

Scripts

Each worker gets its own start and deploy script:

# run the huggingface worker locally
bun start:hf

# deploy the huggingface worker to hf.you.workers.dev
bun deploy:hf

Workers

huggingface

Wrapper around the 🤗 Inference API.

GET /

Supports query params with task presets and default models for convenience.

curl \
  -G \
  -d 'model=google/flan-t5-base&inputs=Translate+to+French:+I+love+Hugging+Face!' \
  http://localhost:8787

curl \
  -G \
  -d 'task=text-to-image&inputs=watercolor+painting+marina+sunset&negative_prompt=birds' \
  http://localhost:8787

POST /

Same as above, but with a JSON body.

curl \
  -d '{ "task": "text-to-speech", "inputs": "I love Hugging Face!" }' \
  -o speech.flac \
  http://localhost:8787

POST /chat/completions

OpenAI-compatible chat format. Model defaults to huggingfaceh4/zephyr-7b-beta. See TGI Messages API.

curl \
  -d '{ "messages": [{ "role": "system", "content": "Be precise and concise." }, { "role": "user", "content": "How many stars are in our galaxy?" }] }' \
  http://localhost:8787/chat/completions

import OpenAI from 'openai'

const openai = new OpenAI({
  apiKey: '', // pass empty string
  baseURL: 'http://localhost:8787' // the SDK appends /chat/completions to the base URL
})

const stream = await openai.chat.completions.create({
  stream: true,
  model: 'huggingfaceh4/zephyr-7b-beta',
  messages: [
    { role: 'system', content: 'Be precise and concise.' },
    { role: 'user', content: 'How many stars are in our galaxy?' }
  ]
})

for await (const chunk of stream) {
  process.stdout.write(chunk.choices[0]?.delta?.content || '')
}

perplexity

Wrapper around the Perplexity.ai API.

GET /

Supports query params for convenience. The model defaults to llama-3-sonar-small-32k-chat and the system prompt defaults to "Be precise and concise."

curl \
  -G \
  -d 'prompt=How+many+stars+are+in+our+galaxy?' \
  http://localhost:8787

POST /

OpenAI-compatible chat format.

curl \
  -d '{ "messages": [{ "role": "system", "content": "Be precise and concise." }, { "role": "user", "content": "How many stars are in our galaxy?" }] }' \
  http://localhost:8787

import OpenAI from 'openai'

const openai = new OpenAI({
  apiKey: '', // pass empty string
  baseURL: 'http://localhost:8787' // the SDK appends /chat/completions to the base URL
})

const stream = await openai.chat.completions.create({
  stream: true,
  model: 'llama-3-sonar-large-32k-chat',
  messages: [
    { role: 'system', content: 'Be precise and concise.' },
    { role: 'user', content: 'How many stars are in our galaxy?' }
  ]
})

for await (const chunk of stream) {
  process.stdout.write(chunk.choices[0]?.delta?.content || '')
}

proxy

Simple proxy for any URL. Sets CORS headers on the response. Accepts an optional headers query param: key=value pairs are added to the upstream request, and names prefixed with - are stripped from the response.

curl http://localhost:8787/user/1?host=api.github.com

# with auth
curl "http://localhost:8787/user/1?host=api.github.com&headers=authorization=Bearer%20${GH_TOKEN}"

# with response headers removed
curl "http://localhost:8787/user/1?host=api.github.com&headers=-x-frame-options,-content-security-policy"
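
A rough sketch of how such a proxy could be wired up (illustrative only, not the repo's actual implementation; the parsing details are assumptions):

// illustrative sketch: forward the request to the ?host= target, add key=value
// headers to the upstream request, strip -prefixed ones from the response, and
// attach a permissive CORS header
export default {
  async fetch(request: Request): Promise<Response> {
    const url = new URL(request.url)
    const host = url.searchParams.get('host')
    if (!host) return new Response('missing host', { status: 400 })

    const add: [string, string][] = []
    const remove: string[] = []
    for (const h of (url.searchParams.get('headers') ?? '').split(',').filter(Boolean)) {
      if (h.startsWith('-')) {
        remove.push(h.slice(1))
      } else {
        const [key, ...rest] = h.split('=')
        add.push([key, rest.join('=')])
      }
    }

    const headers = new Headers(request.headers)
    for (const [key, value] of add) headers.set(key, value)

    // (query params other than host/headers are not forwarded in this sketch)
    const upstream = await fetch(new URL(url.pathname, `https://${host}`), {
      method: request.method,
      headers,
      body: request.body
    })

    const responseHeaders = new Headers(upstream.headers)
    for (const name of remove) responseHeaders.delete(name)
    responseHeaders.set('access-control-allow-origin', '*')
    return new Response(upstream.body, { status: upstream.status, headers: responseHeaders })
  }
}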

Issues

Env middleware

In workers, environment variables are not globally available like process.env or import.meta.env. They are passed to the request handler, so you only have access to them within the scope of that handler.

Create a simple middleware that takes a list of strings and throws a 5xx error if any are not on the env object.
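
Something like the following could do it (a rough sketch; requireEnv is a hypothetical name, not code from the repo):

// illustrative sketch: check that the listed keys exist on env before handling the request
const requireEnv =
  (keys: string[]) =>
  (env: Record<string, unknown>): void => {
    const missing = keys.filter((key) => env[key] === undefined || env[key] === '')
    if (missing.length > 0) {
      // 500-class because a missing binding is a server misconfiguration, not a client error
      throw new Error(`Missing environment variables: ${missing.join(', ')}`)
    }
  }

export default {
  async fetch(request: Request, env: Record<string, unknown>): Promise<Response> {
    try {
      requireEnv(['HF_TOKEN'])(env)
    } catch (err) {
      return new Response((err as Error).message, { status: 500 })
    }
    return new Response('ok')
  }
}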

Examples

Create an examples folder. Durable Objects, Vectorize, Queues, WebSockets, KV, D1, etc.

TODO

  • Build and test with Bun.
  • GitHub Actions.

Fix CORS

Current configuration (#6) assumes the browser always includes the Origin header. It does on the OPTIONS preflight, but not on the actual request.
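
One possible direction (a sketch, not the repo's configuration): attach the CORS header to every response unconditionally instead of deriving it from the request's Origin header, and keep Origin-dependent logic only for the OPTIONS preflight.

// illustrative sketch: wrap any response with a wildcard CORS header so the
// actual (non-preflight) request does not depend on an Origin header being present
function withCors(response: Response): Response {
  const headers = new Headers(response.headers)
  headers.set('access-control-allow-origin', '*')
  return new Response(response.body, { status: response.status, headers })
}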
