spotify / scio

A Scala API for Apache Beam and Google Cloud Dataflow.

Home Page: https://spotify.github.io/scio

License: Apache License 2.0

Java 25.68% Scala 73.76% Python 0.41% Shell 0.09% C++ 0.05% StringTemplate 0.01%
scala bigquery google-cloud beam dataflow batch streaming data ml scio

scio's Introduction

Scio

Ecclesiastical Latin IPA: /ˈʃi.o/, [ˈʃiː.o], [ˈʃi.i̯o] Verb: I can, know, understand, have knowledge.

Scio is a Scala API for Apache Beam and Google Cloud Dataflow inspired by Apache Spark and Scalding.

Scio 0.3.0 and later versions depend on Apache Beam (org.apache.beam), while earlier versions depend on the Google Cloud Dataflow SDK (com.google.cloud.dataflow). See this page for a list of breaking changes.

Features

  • Scala API close to that of Spark and Scalding core APIs (see the sketch after this list)
  • Unified batch and streaming programming model
  • Fully managed service*
  • Integration with Google Cloud products: Cloud Storage, BigQuery, Pub/Sub, Datastore, Bigtable
  • JDBC, TensorFlow TFRecords, Cassandra, Elasticsearch and Parquet I/O
  • Interactive mode with Scio REPL
  • Type safe BigQuery
  • Integration with Algebird and Breeze
  • Pipeline orchestration with Scala Futures
  • Distributed cache

* provided by Google Cloud Dataflow
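
To give a flavor of the API, here is a minimal word-count pipeline, sketched after the WordCount example in scio-examples (the input/output argument names are illustrative):

import com.spotify.scio._

object WordCount {
  def main(cmdlineArgs: Array[String]): Unit = {
    val (sc, args) = ContextAndArgs(cmdlineArgs)
    sc.textFile(args("input"))
      .flatMap(_.split("[^a-zA-Z']+").filter(_.nonEmpty))
      .countByValue
      .map { case (word, count) => s"$word: $count" }
      .saveAsTextFile(args("output"))
    sc.run()
  }
}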

Quick Start

Download and install the Java Development Kit (JDK) version 8.

Install sbt.

Use our giter8 template to quickly create a new Scio job repository:

sbt new spotify/scio.g8

Switch to the new repo (default scio-job) and build it:

cd scio-job
sbt stage

Run the included word count example:

target/universal/stage/bin/scio-job --output=wc

List result files and inspect content:

ls -l wc
cat wc/part-00000-of-00004.txt

Documentation

Getting Started is the best place to start with Scio. If you are new to Apache Beam and distributed data processing, check out the Beam Programming Guide first for a detailed explanation of the Beam programming model and concepts. If you have experience with other Scala data processing libraries, check out this comparison between Scio, Scalding and Spark.

Example Scio pipelines and tests can be found under scio-examples. Many of them are direct ports from Beam's Java examples. See this page for some of them with side-by-side explanations. Also see Big Data Rosetta Code for common data processing code snippets in Scio, Scalding and Spark.

Artifacts

Scio includes the following artifacts:

  • scio-avro: add-on for Avro, can also be used standalone
  • scio-cassandra*: add-ons for Cassandra
  • scio-core: core library
  • scio-elasticsearch*: add-ons for Elasticsearch
  • scio-extra: extra utilities for working with collections, Breeze, etc., best effort support
  • scio-google-cloud-platform: add-on for Google Cloud IOs: BigQuery, Bigtable, Pub/Sub, Datastore, Spanner
  • scio-grpc: add-on for gRPC service calls
  • scio-jdbc: add-on for JDBC IO
  • scio-neo4j: add-on for Neo4J IO
  • scio-parquet: add-on for Parquet
  • scio-redis: add-on for Redis
  • scio-repl: extension of the Scala REPL with Scio specific operations
  • scio-smb: add-on for Sort Merge Bucket operations
  • scio-tensorflow: add-on for TensorFlow TFRecords IO and prediction
  • scio-test: test utilities, add to your project as a "test" dependency
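
To depend on these from sbt, add something like the following to build.sbt (replace <version> with the latest release on Maven Central):

libraryDependencies ++= Seq(
  "com.spotify" %% "scio-core" % "<version>",
  "com.spotify" %% "scio-test" % "<version>" % Test
)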

License

Copyright 2021 Spotify AB.

Licensed under the Apache License, Version 2.0: http://www.apache.org/licenses/LICENSE-2.0

scio's People

Contributors

andrewsmartin, andrisnoko, anish749, benfradet, clairemcginty, dependabot[bot], elpicador, fallonchen, farzad-sedghi, i-maravic, jbigred1, jto, kanterov, kellen, martinbomio, mrkm4ntr, nevillelyh, psobot, ravwojdyla, regadas, rustedbones, samschlegel, scala-steward, shnapz, sisidra, spkrka, spotify-steward[bot], stormy-ua, syodage, yonromai

scio's Issues

Simple dependency primitives

Add utilities for waiting on or checking external dependencies. They should return either Option[Tap[T]] or Future[Tap[T]], maybe like:

Resources.getAvroFile(): Option[Tap[T]]
Resources.waitForAvroFile(): Future[Tap[T]]
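
A hypothetical sketch of what these could look like (Resources and both methods do not exist yet; the signatures are illustrative):

import scala.concurrent.Future
import com.spotify.scio.io.Tap

object Resources {
  // Return a Tap if the Avro file already exists, None otherwise.
  def getAvroFile[T](path: String): Option[Tap[T]] = ???

  // Return a Future that completes with a Tap once the file becomes available.
  def waitForAvroFile[T](path: String): Future[Tap[T]] = ???
}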

Scio REPL should exclude JDK jars from filesToStage

scio> __scio__df__opts__.getFilesToStage.asScala.foreach(println)
/Library/Java/JavaVirtualMachines/jdk1.8.0_11.jdk/Contents/Home/jre/lib/resources.jar
/Library/Java/JavaVirtualMachines/jdk1.8.0_11.jdk/Contents/Home/jre/lib/rt.jar
/Library/Java/JavaVirtualMachines/jdk1.8.0_11.jdk/Contents/Home/jre/lib/jsse.jar
/Library/Java/JavaVirtualMachines/jdk1.8.0_11.jdk/Contents/Home/jre/lib/jce.jar
/Library/Java/JavaVirtualMachines/jdk1.8.0_11.jdk/Contents/Home/jre/lib/charsets.jar
/Library/Java/JavaVirtualMachines/jdk1.8.0_11.jdk/Contents/Home/jre/lib/jfr.jar
/Library/Java/JavaVirtualMachines/jdk1.8.0_11.jdk/Contents/Home/jre/lib/ext/cldrdata.jar
/Library/Java/JavaVirtualMachines/jdk1.8.0_11.jdk/Contents/Home/jre/lib/ext/dnsns.jar
/Library/Java/JavaVirtualMachines/jdk1.8.0_11.jdk/Contents/Home/jre/lib/ext/jfxrt.jar
/Library/Java/JavaVirtualMachines/jdk1.8.0_11.jdk/Contents/Home/jre/lib/ext/localedata.jar
/Library/Java/JavaVirtualMachines/jdk1.8.0_11.jdk/Contents/Home/jre/lib/ext/nashorn.jar
/Library/Java/JavaVirtualMachines/jdk1.8.0_11.jdk/Contents/Home/jre/lib/ext/sunec.jar
/Library/Java/JavaVirtualMachines/jdk1.8.0_11.jdk/Contents/Home/jre/lib/ext/sunjce_provider.jar
/Library/Java/JavaVirtualMachines/jdk1.8.0_11.jdk/Contents/Home/jre/lib/ext/sunpkcs11.jar
/Library/Java/JavaVirtualMachines/jdk1.8.0_11.jdk/Contents/Home/jre/lib/ext/zipfs.jar
/System/Library/Java/Extensions/MRJToolkit.jar
/Users/neville/src/gcp/scio/scio-repl/target/scala-2.11/scio-repl-0.1.2-SNAPSHOT-fat.jar
/Users/neville/src/gcp/scio/.
/var/folders/hd/f_wgpmsj3h1cv071xszfmyzh0000gn/T/1457400626064-0/scio-repl-session.jar
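
A possible fix, sketched against the options object shown above: drop every staged file that lives under the JRE home.

import scala.collection.JavaConverters._

val javaHome = new java.io.File(sys.props("java.home")).getCanonicalPath
val staged = __scio__df__opts__.getFilesToStage.asScala
val filtered = staged.filterNot(f => new java.io.File(f).getCanonicalPath.startsWith(javaHome))
__scio__df__opts__.setFilesToStage(filtered.asJava)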

Handle repl arguments gracefully

Currently an invalid argument to the REPL breaks Scio context creation. Invalid and extra arguments should be handled gracefully in the REPL.

BigQueryType throws ServiceConfigurationError in console

This used to work.

scala> import com.spotify.scio.bigquery.types.BigQueryType
import com.spotify.scio.bigquery.types.BigQueryType

scala> @BigQueryType.fromTable("di-bigquery-data-commons:track_entity.track_entity_20160224") class TE
SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder".
SLF4J: Defaulting to no-operation (NOP) logger implementation
SLF4J: See http://www.slf4j.org/codes.html#StaticLoggerBinder for further details.
<console>:11: error: exception during macro expansion:
java.util.ServiceConfigurationError: com.google.cloud.dataflow.sdk.runners.PipelineRunnerRegistrar: Provider com.google.cloud.dataflow.sdk.runners.DataflowPipelineRegistrar$Runner not a subtype
        at java.util.ServiceLoader.fail(ServiceLoader.java:231)
        at java.util.ServiceLoader.access$300(ServiceLoader.java:181)
        at java.util.ServiceLoader$LazyIterator.next(ServiceLoader.java:369)
        at java.util.ServiceLoader$1.next(ServiceLoader.java:445)
        at com.google.cloud.dataflow.sdk.repackaged.com.google.common.collect.Iterators.addAll(Iterators.java:362)
        at com.google.cloud.dataflow.sdk.repackaged.com.google.common.collect.Lists.newArrayList(Lists.java:160)
        at com.google.cloud.dataflow.sdk.repackaged.com.google.common.collect.Lists.newArrayList(Lists.java:144)
        at com.google.cloud.dataflow.sdk.options.PipelineOptionsFactory.<clinit>(PipelineOptionsFactory.java:510)
        at com.spotify.scio.bigquery.BigQueryClient$.apply(BigQueryClient.scala:384)
        at com.spotify.scio.bigquery.BigQueryClient$.apply(BigQueryClient.scala:377)
        at com.spotify.scio.bigquery.types.TypeProvider$.bigquery$lzycompute(TypeProvider.scala:32)
        at com.spotify.scio.bigquery.types.TypeProvider$.bigquery(TypeProvider.scala:32)
        at com.spotify.scio.bigquery.types.TypeProvider$.tableImpl(TypeProvider.scala:41)

       @BigQueryType.fromTable("di-bigquery-data-commons:track_entity.track_entity_20160224") class TE
        ^

SCollection name for parallelize

parallelize* sets the name of the SCollection to elems.toString. Is this a problem? What if elems is large, or an element in elems is large? Will Dataflow accept it?
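
One possible mitigation, sketched (truncate and the 256-character cap are illustrative, not existing Scio code): keep deriving the name from elems.toString but cap its length.

object Names {
  // Cap a transform name derived from elems.toString (limit is illustrative).
  def truncate(s: String, max: Int = 256): String =
    if (s.length <= max) s else s.take(max - 3) + "..."
}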

StackOverflowError when reading from BigQuery with SDK 1.5.0

This could be related to the change to import/export BigQuery data as Avro introduced in SDK 1.5.0.

java.lang.StackOverflowError
at org.apache.avro.io.parsing.Symbol$Sequence.flattenedSize(Symbol.java:323)
at org.apache.avro.io.parsing.Symbol.flattenedSize(Symbol.java:216)
at org.apache.avro.io.parsing.Symbol$Sequence.flattenedSize(Symbol.java:323)
at org.apache.avro.io.parsing.Symbol.flattenedSize(Symbol.java:216)
at org.apache.avro.io.parsing.Symbol$Sequence.flattenedSize(Symbol.java:323)
at org.apache.avro.io.parsing.Symbol.flattenedSize(Symbol.java:216)

SDK 1.5.0 exports BQ as Avro instead of JSON:
GoogleCloudPlatform/DataflowJavaSDK@4dce3c2

Related:
http://stackoverflow.com/questions/24130615/circular-references-not-handled-in-avro
https://code.google.com/p/google-bigquery/issues/detail?id=381
GoogleCloudDataproc/hadoop-connectors#16
GoogleCloudPlatform/DataflowJavaSDK#152

sampleByKey tests fail in Travis

[info] - should support sampleByKey()
Exception in thread "Thread-10" Exception in thread "Thread-14" java.io.EOFException
    at java.io.ObjectInputStream$BlockDataInputStream.peekByte(ObjectInputStream.java:2598)
    at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1318)
    at java.io.ObjectInputStream.readObject(ObjectInputStream.java:370)
    at org.scalatest.tools.Framework$ScalaTestRunner$Skeleton$1$React.react(Framework.scala:795)
    at org.scalatest.tools.Framework$ScalaTestRunner$Skeleton$1.run(Framework.scala:784)
    at java.lang.Thread.run(Thread.java:745)
java.io.EOFException
    at java.io.ObjectInputStream$BlockDataInputStream.peekByte(ObjectInputStream.java:2598)
    at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1318)
    at java.io.ObjectInputStream.readObject(ObjectInputStream.java:370)
    at sbt.React.react(ForkTests.scala:114)
    at sbt.ForkTests$$anonfun$mainTestTask$1$Acceptor$2$.run(ForkTests.scala:74)
    at java.lang.Thread.run(Thread.java:745)
[info] Run completed in 23 seconds, 252 milliseconds.
[info] Total number of tests run: 124
[info] Suites: completed 6, aborted 0
[info] Tests: succeeded 123, failed 1, canceled 0, ignored 0, pending 0
[info] *** 1 TEST FAILED ***

no native library is found for os.name=Mac and os.arch=x86_64

scio> sc.textFile("README.md")
org.xerial.snappy.SnappyError: [FAILED_TO_LOAD_NATIVE_LIBRARY] no native library is found for os.name=Mac and os.arch=x86_64
  at org.xerial.snappy.SnappyLoader.findNativeLibrary(SnappyLoader.java:331)
  at org.xerial.snappy.SnappyLoader.loadNativeLibrary(SnappyLoader.java:171)
  at org.xerial.snappy.SnappyLoader.load(SnappyLoader.java:152)
  at org.xerial.snappy.Snappy.<clinit>(Snappy.java:47)
  at org.xerial.snappy.SnappyOutputStream.<init>(SnappyOutputStream.java:97)
  at org.xerial.snappy.SnappyOutputStream.<init>(SnappyOutputStream.java:89)
  at org.xerial.snappy.SnappyOutputStream.<init>(SnappyOutputStream.java:79)
  at com.google.cloud.dataflow.sdk.util.SerializableUtils.serializeToByteArray(SerializableUtils.java:49)
  at com.google.cloud.dataflow.sdk.util.SerializableUtils.ensureSerializable(SerializableUtils.java:84)
  at com.google.cloud.dataflow.sdk.io.Read$Bounded.<init>(Read.java:107)
  at com.google.cloud.dataflow.sdk.io.Read$Bounded.<init>(Read.java:102)
  at com.google.cloud.dataflow.sdk.io.Read.from(Read.java:61)
  at com.google.cloud.dataflow.sdk.io.TextIO$Read$Bound.apply(TextIO.java:318)
  at com.google.cloud.dataflow.sdk.io.TextIO$Read$Bound.apply(TextIO.java:203)
  at com.google.cloud.dataflow.sdk.runners.PipelineRunner.apply(PipelineRunner.java:74)
  at com.google.cloud.dataflow.sdk.runners.DirectPipelineRunner.apply(DirectPipelineRunner.java:247)
  at com.google.cloud.dataflow.sdk.Pipeline.applyInternal(Pipeline.java:367)
  at com.google.cloud.dataflow.sdk.Pipeline.applyTransform(Pipeline.java:290)
  at com.google.cloud.dataflow.sdk.values.PBegin.apply(PBegin.java:58)
  at com.google.cloud.dataflow.sdk.Pipeline.apply(Pipeline.java:171)
  at com.spotify.scio.ScioContext.applyInternal(ScioContext.scala:290)
  at com.spotify.scio.ScioContext$$anonfun$textFile$1.apply(ScioContext.scala:425)
  at com.spotify.scio.ScioContext$$anonfun$textFile$1.apply(ScioContext.scala:422)
  at com.spotify.scio.ScioContext.pipelineOp(ScioContext.scala:248)
  at com.spotify.scio.ScioContext.textFile(ScioContext.scala:421)
  ... 32 elided

This seems to come from a recent change (37dec8f) in the Dataflow SDK, and looks like an issue in xerial/snappy-java.

AvroHadoopFileSource does not allow specifying a reader schema

By default, AvroKeyInputFormat uses the writer schema as the reader schema, and AvroHadoopFileSource inherits this behavior.
While this is fine when reading a set of Avro files that were all written with the same schema, it breaks down when reading a set of files that do not share the exact same schema.
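
For reference, this is how a reader schema is normally supplied to AvroKeyInputFormat through the Hadoop job configuration; AvroHadoopFileSource would need to expose something equivalent (the schema JSON here is an arbitrary example):

import org.apache.avro.Schema
import org.apache.avro.mapreduce.AvroJob
import org.apache.hadoop.mapreduce.Job

val job = Job.getInstance()
// The schema you want records decoded into, regardless of writer schema.
val readerSchema: Schema = new Schema.Parser().parse(
  """{"type": "record", "name": "r", "fields": [{"name": "str", "type": ["null", "string"]}]}"""
)
AvroJob.setInputKeySchema(job, readerSchema)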

org.apache.hadoop.fs.FileSystem conflict in assembly jars

Two copies of the META-INF/services/org.apache.hadoop.fs.FileSystem service file may be included in an assembly jar that depends on scio-hdfs: one from hadoop-common and one from hadoop-hdfs. We should fix this so that end users don't have to add a custom merge strategy, or at least document it.

diff hadoop-common/META-INF/services/org.apache.hadoop.fs.FileSystem hadoop-hdfs/META-INF/services/org.apache.hadoop.fs.FileSystem
16,19c16,20
< org.apache.hadoop.fs.LocalFileSystem
< org.apache.hadoop.fs.viewfs.ViewFileSystem
< org.apache.hadoop.fs.ftp.FTPFileSystem
< org.apache.hadoop.fs.HarFileSystem

---
> org.apache.hadoop.hdfs.DistributedFileSystem
> org.apache.hadoop.hdfs.web.HftpFileSystem
> org.apache.hadoop.hdfs.web.HsftpFileSystem
> org.apache.hadoop.hdfs.web.WebHdfsFileSystem
> org.apache.hadoop.hdfs.web.SWebHdfsFileSystem
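
The usual workaround in build.sbt, sketched with sbt-assembly's documented pattern: concatenate the distinct lines of the service files instead of picking one jar's copy.

assemblyMergeStrategy in assembly := {
  // Merge all META-INF/services files by keeping every distinct line.
  case PathList("META-INF", "services", xs @ _*) => MergeStrategy.filterDistinctLines
  case x =>
    val oldStrategy = (assemblyMergeStrategy in assembly).value
    oldStrategy(x)
}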

Nulls from closure

We use chill's ClosureCleaner to clean up lambdas for serialization. There are some edge cases where vals from a closure end up being null.
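
A minimal illustration of the kind of edge case meant here (a sketch; the class and field names are invented): a val reached only through the enclosing instance can come back null on the worker if the cleaner decides the outer reference is not needed.

import com.spotify.scio.values.SCollection

class WithSuffix extends Serializable {
  val suffix = "!"                          // captured via the enclosing `this`
  def addSuffix(data: SCollection[String]): SCollection[String] =
    data.map(s => s + suffix)               // in the bad case, suffix is null on the worker
}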

Unable to deserialize FN //serialVersionUIDs

Got this error a couple of times in a row; setting serialVersionUIDs helps.

java.lang.IllegalArgumentException: unable to deserialize serialized fn info
    at com.google.cloud.dataflow.sdk.util.SerializableUtils.deserializeFromByteArray(SerializableUtils.java:76)
    at com.google.cloud.dataflow.sdk.runners.worker.UserParDoFnFactory.create(UserParDoFnFactory.java:61)
    at com.google.cloud.dataflow.sdk.runners.worker.DefaultParDoFnFactory.create(DefaultParDoFnFactory.java:70)
    at com.google.cloud.dataflow.sdk.runners.worker.MapTaskExecutorFactory.createParDoOperation(MapTaskExecutorFactory.java:219)
    at com.google.cloud.dataflow.sdk.runners.worker.MapTaskExecutorFactory.createOperation(MapTaskExecutorFactory.java:133)
    at com.google.cloud.dataflow.sdk.runners.worker.MapTaskExecutorFactory.create(MapTaskExecutorFactory.java:85)
    at com.google.cloud.dataflow.sdk.runners.worker.DataflowWorker.doWork(DataflowWorker.java:176)
    at com.google.cloud.dataflow.sdk.runners.worker.DataflowWorker.getAndPerformWork(DataflowWorker.java:149)
    at com.google.cloud.dataflow.sdk.runners.worker.DataflowWorkerHarness$WorkerThread.doWork(DataflowWorkerHarness.java:192)
    at com.google.cloud.dataflow.sdk.runners.worker.DataflowWorkerHarness$WorkerThread.call(DataflowWorkerHarness.java:173)
    at com.google.cloud.dataflow.sdk.runners.worker.DataflowWorkerHarness$WorkerThread.call(DataflowWorkerHarness.java:160)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)
Caused by: java.io.InvalidClassException: com.spotify.scio.values.SCollection$$anon$3; local class incompatible: stream classdesc serialVersionUID = 1458661216598417823, local class serialVersionUID = -3081382441803323297
    at java.io.ObjectStreamClass.initNonProxy(ObjectStreamClass.java:621)
    at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1623)
    at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1518)
    at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1774)
    at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
    at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1993)
    at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1918)
    at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801)
    at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
    at java.io.ObjectInputStream.readObject(ObjectInputStream.java:371)
    at com.google.cloud.dataflow.sdk.util.SerializableUtils.deserializeFromByteArray(SerializableUtils.java:73)
    ... 14 more

From Serializable doc:

If a serializable class does not explicitly declare a serialVersionUID, then the serialization runtime will calculate a default serialVersionUID value for that class based on various aspects of the class, as described in the Java(TM) Object Serialization Specification. However, it is strongly recommended that all serializable classes explicitly declare serialVersionUID values, since the default serialVersionUID computation is highly sensitive to class details that may vary depending on compiler implementations, and can thus result in unexpected InvalidClassExceptions during deserialization. Therefore, to guarantee a consistent serialVersionUID value across different java compiler implementations, a serializable class must declare an explicit serialVersionUID value. It is also strongly advised that explicit serialVersionUID declarations use the private modifier where possible, since such declarations apply only to the immediately declaring class--serialVersionUID fields are not useful as inherited members. Array classes cannot declare an explicit serialVersionUID, so they always have the default computed value, but the requirement for matching serialVersionUID values is waived for array classes.

We may have to add serialVersionUID or @SerialVersionUID in a bunch of places.
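
What such a fix looks like in Scala (the class is illustrative): pin the UID explicitly so recompilation does not change it.

@SerialVersionUID(1L)
class AddOne extends (Int => Int) with Serializable {
  def apply(x: Int): Int = x + 1
}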

Pretty printing BQ schema does not show nullable fields within repeated record as option type

If a repeated record contains nullable fields, they are not displayed as option types when pretty printing the schema.

Example:

Field Type Mode
strlist RECORD REPEATED
strlist.str STRING NULLABLE

REPL session:

scio> @BigQueryType.fromSchema(
     |     """
     |       |{
     |       |   "fields": [
     |       |     {"mode": "REPEATED", "name": "strlist", "type": "RECORD", "fields": [
     |       |         {"mode": "NULLABLE", "name": "str", "type": "STRING"}
     |       |     ]}
     |       |   ]
     |       |}
     |     """.stripMargin
     |   )
     |   class Row
defined class Row
defined object Row
defined class Strlist$

scio> Row.toPrettyString(2)
res1: String =
(
  strlist: List[(
    str: String)])

scio>

The expected output:

scio> Row.toPrettyString(2)
res1: String =
(
  strlist: List[(
    str: Option[String])])

scio>
