conduktor / kafka-security-manager

353 stars · 39 watchers · 158 forks · 270 KB

Manage your Kafka ACL at scale

Home Page: https://hub.docker.com/r/simplesteph/kafka-security-manager

License: MIT License

Languages: Shell 0.61%, Scala 99.39%
Topics: kafka, acl, ksm, zookeeper, docker, acl-changes, broker, security

kafka-security-manager's People

Contributors

andriyfedorov, brunodomenici, devshawn, diablo2050, gurinderu, infogulch, isaackuang, ivan-klass, ivanbmstu, j1king, jmcristobal, kentso, kgupta26, michellewen1516, ntrp, o0oxid, radishthehut, satish-centrifuge, sderosiaux, shayaantx, silverbadge, simplesteph, thealmightygrant, trobert, vultron81, zhangzimou


kafka-security-manager's Issues

Add a UI

PR most welcome.

Features

  • Right now I'm thinking of a read-only API to view the list of ACLs in a searchable table (a rough sketch follows the task list below).

Implementation

To make things fun (and always keep on learning):

Tasks

  • gRPC endpoints
  • REST gateway
  • UI
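
A hedged sketch of what the read-only view might expose; the names below are illustrative placeholders, and the actual proto/gRPC definitions would be part of the PR:

    // illustrative Scala data model for a read-only, searchable ACL listing
    final case class AclEntry(
      principal: String,      // e.g. "User:CN=alice"
      resourceType: String,   // Topic, Group, Cluster, ...
      resourceName: String,
      operation: String,      // Read, Write, Describe, ...
      permissionType: String, // Allow or Deny
      host: String
    )

    trait AclReadOnlyService {
      // list all ACLs currently applied, optionally filtered by a free-text query
      def listAcls(query: Option[String] = None): Seq[AclEntry]
    }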

running without Docker?

Could you please document the invocation without Docker?
Either my java -jar or java -cp attempts are wrong, or I'm having a Scala version problem...

Thanks!
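
A hedged sketch of one way it could be launched outside Docker, assuming the packaged jars shipped in the Docker image under /opt/docker/lib (the jar name appears in a stack trace further down this page) and that configuration is provided through environment variables or an application.conf on the classpath; the main class name is taken from the stack traces on this page:

    # hypothetical invocation; the lib directory and config mechanism are assumptions
    java -cp "/opt/docker/lib/*" com.github.simplesteph.ksm.KafkaSecurityManager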

Bitbucket Cloud config required despite not being used

Our KSM deployments use com.github.simplesteph.ksm.source.BitbucketServerSourceAcl as we have a private Bitbucket deployment.

However, now our KSM apps running on :latest are failing to start up due to:

Exception in thread "main" com.typesafe.config.ConfigException$UnresolvedSubstitution: application.conf @ jar:file:/opt/docker/lib/com.github.simplesteph.ksm.kafka-security-manager-0.9-SNAPSHOT.jar!/application.conf: 115: Could not resolve substitution to a value: ${SOURCE_BITBUCKET_CLOUD_AUTH_PASSWORD}

Seems this config is weirdly required even though we don't use the cloud source ACL type.

I guess due to https://github.com/simplesteph/kafka-security-manager/blob/master/src/main/resources/application.conf#L114-L115

All the other env-var lookups there are optional; only these two are required.
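
For reference, HOCON treats ${VAR} as a required substitution and ${?VAR} as an optional one. A hedged sketch of how those two lines could be made optional so the Bitbucket Cloud credentials only need to be provided when that source is actually selected (the key names are assumed to follow the pattern used elsewhere in application.conf):

    # application.conf (sketch)
    source.bitbucket.cloud.auth.username = ${?SOURCE_BITBUCKET_CLOUD_AUTH_USERNAME}
    source.bitbucket.cloud.auth.password = ${?SOURCE_BITBUCKET_CLOUD_AUTH_PASSWORD}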

Unable to use basic auth

I get this exception when trying to use basic auth:

[2019-07-02 14:58:26,349] ERROR Refreshing the source failed (AclSynchronizer)
java.io.UnsupportedEncodingException: UTF ​-8
	at java.lang.StringCoding.encode(StringCoding.java:341)
	at java.lang.String.getBytes(String.java:918)
	at com.github.simplesteph.ksm.source.GitHubSourceAcl.$anonfun$refresh$1(GitHubSourceAcl.scala:58)
	at scala.Option.foreach(Option.scala:257)
	at com.github.simplesteph.ksm.source.GitHubSourceAcl.refresh(GitHubSourceAcl.scala:57)
	at com.github.simplesteph.ksm.AclSynchronizer.$anonfun$run$1(AclSynchronizer.scala:76)
	at scala.util.Try$.apply(Try.scala:209)
	at com.github.simplesteph.ksm.AclSynchronizer.run(AclSynchronizer.scala:76)
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
	at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308)
	at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180)
	at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)

Note the space between "UTF" and "-8". I believe there are extraneous characters here:

https://github.com/simplesteph/kafka-security-manager/blob/15f802df04f0837275e67c349e39db5ea348256d/src/main/scala/com/github/simplesteph/ksm/source/GitHubSourceAcl.scala#L58

That line looks reasonable in GitHub, but when I open it in vi it looks like this:

val basicB64 = Base64.getEncoder.encodeToString(basic.getBytes("UTF<200c><200b>-8"))

I believe that string should just be "UTF-8".
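
A hedged sketch of the fix; using StandardCharsets.UTF_8 avoids looking the charset up by name, so stray invisible characters in a string literal can no longer trigger an UnsupportedEncodingException (the credential values below are placeholders):

    import java.nio.charset.StandardCharsets
    import java.util.Base64

    // placeholder credentials for illustration
    val username = "user"
    val token = "personal-access-token"
    val basic = s"$username:$token"
    // encode for the HTTP Basic auth header without a charset-name lookup
    val basicB64 = Base64.getEncoder.encodeToString(basic.getBytes(StandardCharsets.UTF_8))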

Unable to use self-made CallbackHandler classes

I am running Kafka in a Kubernetes environment. I have implemented a SASL callback handler for Kafka to log in and verify users via Keycloak. Unfortunately I can't add the login handler to the KSM container, since there is no way to set the Kafka property "sasl.login.callback.handler.class". Moreover, I would have to extend the Java classpath to add my jar file containing the callback handlers, which is also not possible at the moment, as far as I know. Enabling these features would be awesome!
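
For context, this is the standard Kafka client property (introduced in Kafka 2.0) that the issue is asking to pass through; the handler class name is a hypothetical placeholder:

    import java.util.Properties

    val props = new Properties()
    // standard Kafka client setting; the class name is hypothetical
    props.put("sasl.login.callback.handler.class", "com.example.KeycloakLoginCallbackHandler")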

Could not login

This is my jaas.conf file:

Client {
    com.sun.security.auth.module.Krb5LoginModule required
    useKeyTab=true
    storeKey=true
    keyTab="/etc/kafka/secrets/zkclient1.keytab"
    principal="zkclient/[email protected]";
};

KafkaClient {
  org.apache.kafka.common.security.plain.PlainLoginModule required
  username="admin"
  password="admin-secret";
};

And I get this error:

[main] INFO org.apache.zookeeper.ZooKeeper - Initiating client connection, connectString=10.10.12.94:2181,10.10.12.95:2181,10.10.12.96:2181 sessionTimeout=40000 watcher=io.confluent.admin.utils.ZookeeperConnectionWatcher@30dae81
[main-SendThread(kafka-3:2181)] WARN org.apache.zookeeper.SaslClientCallbackHandler - Could not login: the Client is being asked for a password, but the ZooKeeper Client code does not currently support obtaining a password from the user. Make sure that the Client is configured to use a ticket cache (using the JAAS configuration setting 'useTicketCache=true)' and restart the Client. If you still get this message after that, the TGT in the ticket cache has expired and must be manually refreshed. To do so, first determine if you are using a password or a keytab. If the former, run kinit in a Unix shell in the environment of the user who is running this Zookeeper Client using the command 'kinit <princ>' (where <princ> is the name of the Client's Kerberos principal). If the latter, do 'kinit -k -t <keytab> <princ>' (where <princ> is the name of the Kerberos principal, and <keytab> is the location of the keytab file). After manually refreshing your cache, restart this Client. If you continue to see this message after manually refreshing your cache, ensure that your KDC host's clock is in sync with this host's clock.
[main-SendThread(kafka-3:2181)] WARN org.apache.zookeeper.ClientCnxn - SASL configuration failed: javax.security.auth.login.LoginException: No password provided Will continue connection to Zookeeper server without SASL authentication, if Zookeeper server allows it.
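
A hedged note: this warning usually means the ZooKeeper client never picked up (or could not use) the Client section of the JAAS file, so one thing worth checking is that the file is actually passed to the JVM via the standard system property (the path below is a placeholder):

    -Djava.security.auth.login.config=/etc/kafka/secrets/jaas.conf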

Error while reading file from S3 bucket

I am using the com.github.simplesteph.ksm.source.S3SourceAcl source class to read a CSV file from the bucket, but I am getting the following exception.

Exception

WARN Not all bytes were read from the S3ObjectInputStream, aborting HTTP connection. This is likely an error and may result in sub-optimal behavior. Request only the bytes you need via a ranged GET or drain the input stream after use. (com.amazonaws.services.s3.internal.S3AbortableInputStream)
kafka-security-manager_1  | [2020-03-03 19:57:46,948] ERROR unexpected exception (com.github.simplesteph.ksm.KafkaSecurityManager$)
kafka-security-manager_1  | java.util.concurrent.ExecutionException: java.io.IOException: Attempted read on closed stream.
kafka-security-manager_1  |     at java.util.concurrent.FutureTask.report(FutureTask.java:122)
kafka-security-manager_1  |     at java.util.concurrent.FutureTask.get(FutureTask.java:192)
kafka-security-manager_1  |     at com.github.simplesteph.ksm.KafkaSecurityManager$.delayedEndpoint$com$github$simplesteph$ksm$KafkaSecurityManager$1(KafkaSecurityManager.scala:79)
kafka-security-manager_1  |     at com.github.simplesteph.ksm.KafkaSecurityManager$delayedInit$body.apply(KafkaSecurityManager.scala:18)
kafka-security-manager_1  |     at scala.Function0.apply$mcV$sp(Function0.scala:39)
kafka-security-manager_1  |     at scala.Function0.apply$mcV$sp$(Function0.scala:39)
kafka-security-manager_1  |     at scala.runtime.AbstractFunction0.apply$mcV$sp(AbstractFunction0.scala:17)
kafka-security-manager_1  |     at scala.App.$anonfun$main$1$adapted(App.scala:80)
kafka-security-manager_1  |     at scala.collection.immutable.List.foreach(List.scala:392)
kafka-security-manager_1  |     at scala.App.main(App.scala:80)
kafka-security-manager_1  |     at scala.App.main$(App.scala:78)
kafka-security-manager_1  |     at com.github.simplesteph.ksm.KafkaSecurityManager$.main(KafkaSecurityManager.scala:18)
kafka-security-manager_1  |     at com.github.simplesteph.ksm.KafkaSecurityManager.main(KafkaSecurityManager.scala)
kafka-security-manager_1  | Caused by: java.io.IOException: Attempted read on closed stream.
kafka-security-manager_1  |     at org.apache.http.conn.EofSensorInputStream.isReadAllowed(EofSensorInputStream.java:107)
kafka-security-manager_1  |     at org.apache.http.conn.EofSensorInputStream.read(EofSensorInputStream.java:133)
kafka-security-manager_1  |     at com.amazonaws.internal.SdkFilterInputStream.read(SdkFilterInputStream.java:82)
kafka-security-manager_1  |     at com.amazonaws.event.ProgressInputStream.read(ProgressInputStream.java:180)
kafka-security-manager_1  |     at com.amazonaws.internal.SdkFilterInputStream.read(SdkFilterInputStream.java:82)
kafka-security-manager_1  |     at com.amazonaws.internal.SdkFilterInputStream.read(SdkFilterInputStream.java:82)
kafka-security-manager_1  |     at com.amazonaws.internal.SdkFilterInputStream.read(SdkFilterInputStream.java:82)
kafka-security-manager_1  |     at com.amazonaws.event.ProgressInputStream.read(ProgressInputStream.java:180)
kafka-security-manager_1  |     at java.security.DigestInputStream.read(DigestInputStream.java:161)
kafka-security-manager_1  |     at com.amazonaws.services.s3.internal.DigestValidationInputStream.read(DigestValidationInputStream.java:59)
kafka-security-manager_1  |     at com.amazonaws.internal.SdkFilterInputStream.read(SdkFilterInputStream.java:82)
kafka-security-manager_1  |     at com.amazonaws.services.s3.internal.S3AbortableInputStream.read(S3AbortableInputStream.java:125)
kafka-security-manager_1  |     at com.amazonaws.internal.SdkFilterInputStream.read(SdkFilterInputStream.java:82)
kafka-security-manager_1  |     at sun.nio.cs.StreamDecoder.readBytes(StreamDecoder.java:284)
kafka-security-manager_1  |     at sun.nio.cs.StreamDecoder.implRead(StreamDecoder.java:326)
kafka-security-manager_1  |     at sun.nio.cs.StreamDecoder.read(StreamDecoder.java:178)
kafka-security-manager_1  |     at java.io.InputStreamReader.read(InputStreamReader.java:184)
kafka-security-manager_1  |     at java.io.BufferedReader.read1(BufferedReader.java:210)
kafka-security-manager_1  |     at java.io.BufferedReader.read(BufferedReader.java:286)
kafka-security-manager_1  |     at java.io.BufferedReader.fill(BufferedReader.java:161)
kafka-security-manager_1  |     at java.io.BufferedReader.read(BufferedReader.java:182)
kafka-security-manager_1  |     at com.github.tototoshi.csv.ReaderLineReader.readLineWithTerminator(ReaderLineReader.java:21)
kafka-security-manager_1  |     at com.github.tototoshi.csv.CSVReader.parseNext$1(CSVReader.scala:33)
kafka-security-manager_1  |     at com.github.tototoshi.csv.CSVReader.readNext(CSVReader.scala:51)
kafka-security-manager_1  |     at com.github.tototoshi.csv.CSVReader.allWithOrderedHeaders(CSVReader.scala:101)
kafka-security-manager_1  |     at com.github.tototoshi.csv.CSVReader.allWithHeaders(CSVReader.scala:97)
kafka-security-manager_1  |     at com.github.simplesteph.ksm.parser.CsvAclParser.aclsFromReader(CsvAclParser.scala:79)
kafka-security-manager_1  |     at com.github.simplesteph.ksm.AclSynchronizer.run(AclSynchronizer.scala:98)
kafka-security-manager_1  |     at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
kafka-security-manager_1  |     at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308)
kafka-security-manager_1  |     at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180)
kafka-security-manager_1  |     at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294)
kafka-security-manager_1  |     at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
kafka-security-manager_1  |     at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
kafka-security-manager_1  |     at java.lang.Thread.run(Thread.java:748)

Configuration

      SOURCE_CLASS: "com.github.simplesteph.ksm.source.S3SourceAcl"
      SOURCE_S3_REGION: "us-east-1"
      SOURCE_S3_BUCKETNAME: "bucket-name"
      SOURCE_S3_OBJECTKEY: "acls.csv"
      AWS_ACCESS_KEY_ID: ${AWS_ACCESS_KEY_ID}
      AWS_SECRET_ACCESS_KEY: ${AWS_SECRET_ACCESS_KEY}
      AWS_DEFAULT_REGION: ${AWS_DEFAULT_REGION}
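
The S3AbortableInputStream warning followed by "Attempted read on closed stream" suggests the object stream is being closed before the CSV parser has finished with it. A hedged sketch, using the AWS SDK v1 client seen in the stack trace, of one way to avoid this: drain the object into a string first and give the parser a plain StringReader (bucket and key are placeholders):

    import java.io.StringReader
    import com.amazonaws.services.s3.AmazonS3ClientBuilder
    import com.github.tototoshi.csv._
    import scala.io.Source

    val s3 = AmazonS3ClientBuilder.defaultClient()
    val s3Object = s3.getObject("bucket-name", "acls.csv")
    // read the whole object and close the stream before any CSV parsing happens
    val content =
      try Source.fromInputStream(s3Object.getObjectContent, "UTF-8").mkString
      finally s3Object.close()

    val rows = CSVReader.open(new StringReader(content)).allWithHeaders()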

YAML support, Directory Support and compound syntax

Hi, as part of a project we implemented support for the YAML format and multi-file support. The YAML support allows writing a more human-readable permission config:

users:
  C=DE,O=org,OU=WEB,CN=t1.example.com,L=Stuttgart,ST=reg:
    topics:
      Topic1:
        - Read
      TopicPref_*:
        - All
    groups:
      team1-app1-*:
        - Read
        - Describe

In the YAML example you can also see the compound syntax, where specifying All grants all permissions on a topic.

The multi-file support allows organizing the permissions across different files, which makes separation of concerns easier.

The implementation was done around version 0.5, so I would need to invest some time to create a valid pull request. Are you interested in merging this kind of feature? Otherwise I will save myself the time ^^'

DelegationToken not supported on Kafka 1.x.x

Hi,
I found a problem using the kafka-security-manager latest release or v03-release.
As soon as you start the container, even with the default configuration that should not make any changes in ZooKeeper, kafka-security-manager automatically creates a key in /kafka-acl/DelegationToken. Afterwards kafka-acls complains about the DelegationToken key in ZooKeeper. As far as I know, the DelegationToken feature can only be used with Kafka 2. Have you also run into this problem?
Thank you,
Frank

Secure Rest API

Currently the REST API is open; it would be nice to have some basic authentication.

Publish Maven Artifact

This would make it easy for people to get the jars and extend the project with their own proprietary sources without needing to fork the project.

Customizing log4j.properties

Hi, I'm using docker-compose for my kafka-security-manager.
I wanted to set the log level to ERROR, but I couldn't. Is there any way to customize that?
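
A hedged sketch of the kind of override being asked for, assuming the log level can be driven by a custom log4j.properties handed to the JVM via the standard -Dlog4j.configuration property (whether the image exposes a hook for this is exactly the open question):

    # log4j.properties (log4j 1.x syntax)
    log4j.rootLogger=ERROR, stdout
    log4j.appender.stdout=org.apache.log4j.ConsoleAppender
    log4j.appender.stdout.layout=org.apache.log4j.PatternLayout
    log4j.appender.stdout.layout.ConversionPattern=[%d] %p %m (%c)%n

    # passed to the JVM, e.g. -Dlog4j.configuration=file:/etc/ksm/log4j.properties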

Question: CSV multiple operations possible?

Hi,

is it possible to write multiple operations on the same ACL line in the CSV format, as in the JSON version?

Example
Normal example with Write, Describe and Read operations:

"User:CN=Test",Topic,ClientTopic,Write,Allow,*
"User:CN=Test",Topic,ClientTopic,Describe,Allow,*
"User:CN=Test",Topic,ClientTopic,Read,Allow,*

Is it possible to write multiple operations on the same line?
"User:CN=Test",Topic,ClientTopic,Write;Describe;Read,Allow,*

JSON example where it is possible:

{
  "version": 1,
  "acls": [{
    "principals": ["user:alice", "group: kafka-devs"],
    "permissionType": "ALLOW",
    "operations": ["READ", "WRITE"],
    "hosts": ["host1", "host2"]
  }]
}

(Edit by Stephane): Thumbs up or down depending on whether you want this feature.

S3AclSource

Thumbs up if you want this feature, PR welcome

Support Kafka 2 ACL format

Kafka 2 now supports prefixed and literal resource patterns (KAFKA-6841). KSM currently works with the Kafka 2 default of literal. It would be great for KSM to support both, using a new resource-pattern column in the input CSV file, so topic ACLs can use wildcard-style matching when the resource pattern is prefixed.
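
For reference, this is the Kafka 2.0 API the request maps onto; a minimal sketch showing a literal versus a prefixed resource pattern (topic names are just examples):

    import org.apache.kafka.common.resource.{PatternType, ResourcePattern, ResourceType}

    // matches exactly the topic "ClientTopic"
    val literal = new ResourcePattern(ResourceType.TOPIC, "ClientTopic", PatternType.LITERAL)
    // matches every topic whose name starts with "team1-"
    val prefixed = new ResourcePattern(ResourceType.TOPIC, "team1-", PatternType.PREFIXED)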

Failure to connect to Bitbucket as the source

Hi Stephane,

I always get the error "WARN Too many invalid password attempts. Log in at https://id.atlassian.com/ to restore access." when I use Bitbucket as my source. The username is just my username, and I created an app password as the password. I am using Bitbucket Cloud. Do you have any idea what my problem is? Thanks

Unable to run once

Hi there,

The application raises a java.lang.IllegalArgumentException when refresh.frequency.ms (or KSM_REFRESH_FREQUENCY_MS) is set to "-1".

Error:

Exception in thread "main" java.lang.IllegalArgumentException
	at java.util.concurrent.ScheduledThreadPoolExecutor.scheduleAtFixedRate(ScheduledThreadPoolExecutor.java:565)
	at com.github.simplesteph.ksm.KafkaSecurityManager$.delayedEndpoint$com$github$simplesteph$ksm$KafkaSecurityManager$1(KafkaSecurityManager.scala:55)
	at com.github.simplesteph.ksm.KafkaSecurityManager$delayedInit$body.apply(KafkaSecurityManager.scala:13)
	at scala.Function0.apply$mcV$sp(Function0.scala:39)
	at scala.Function0.apply$mcV$sp$(Function0.scala:39)
	at scala.runtime.AbstractFunction0.apply$mcV$sp(AbstractFunction0.scala:17)
	at scala.App.$anonfun$main$1$adapted(App.scala:80)
	at scala.collection.immutable.List.foreach(List.scala:392)
	at scala.App.main(App.scala:80)
	at scala.App.main$(App.scala:78)
	at com.github.simplesteph.ksm.KafkaSecurityManager$.main(KafkaSecurityManager.scala:13)
	at com.github.simplesteph.ksm.KafkaSecurityManager.main(KafkaSecurityManager.scala)
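
The exception comes from ScheduledThreadPoolExecutor.scheduleAtFixedRate, which rejects a non-positive period. A hedged sketch of how the startup code could treat a non-positive refresh frequency as "run once and exit" instead of scheduling (the names are illustrative, not the project's actual ones):

    import java.util.concurrent.{Executors, TimeUnit}

    // illustrative values; in KSM these would come from the configuration
    val refreshFrequencyMs: Long = -1
    val aclSync: Runnable = () => println("synchronizing ACLs once")

    val scheduler = Executors.newScheduledThreadPool(1)
    if (refreshFrequencyMs <= 0) {
      // run-once mode: perform a single synchronization, then shut down
      aclSync.run()
      scheduler.shutdown()
    } else {
      scheduler.scheduleAtFixedRate(aclSync, 0, refreshFrequencyMs, TimeUnit.MILLISECONDS)
    }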

Idea: support for connecting to multiple Zookeeper nodes

The current design requires specifying a single ZooKeeper node that the Kafka Security Manager service will connect to. It would be nice if the configuration allowed a list, plus rotation logic to handle a ZooKeeper node being unavailable due to maintenance or an outage.

Alternatively, how about a configuration where multiple Kafka Security Managers are run, one on each ZooKeeper node, with ZooKeeper used to perform leader election? Only the leader instance of Kafka Security Manager would apply ACLs; the others would sit idle. Any ZK/KSM node could be taken offline at any time and a new leader would be elected, so there would be no disruption of either the ZooKeeper or the Kafka Security Manager service.
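
A hedged sketch of the leader-election idea using Apache Curator's LeaderLatch (Curator is not currently a KSM dependency, and the connection string and path are placeholders):

    import org.apache.curator.framework.CuratorFrameworkFactory
    import org.apache.curator.framework.recipes.leader.LeaderLatch
    import org.apache.curator.retry.ExponentialBackoffRetry

    val client = CuratorFrameworkFactory.newClient(
      "zk1:2181,zk2:2181,zk3:2181", // placeholder ensemble
      new ExponentialBackoffRetry(1000, 3))
    client.start()

    val latch = new LeaderLatch(client, "/ksm/leader")
    latch.start()
    latch.await() // block until this instance becomes leader
    // only the leader proceeds to apply ACLs; the others wait here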

Authenticate to zookeeper with client certificate

Is it possible to authenticate to ZooKeeper with a client certificate?

If possible, it is not immediately obvious from the examples how to do this.

If not possible yet, it would be a very good feature.
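
For reference, ZooKeeper 3.5.5+ clients can authenticate with a client certificate purely through JVM system properties; whether the KSM image provides a way to pass these through is the open question here (paths and passwords are placeholders):

    -Dzookeeper.clientCnxnSocket=org.apache.zookeeper.ClientCnxnSocketNetty
    -Dzookeeper.client.secure=true
    -Dzookeeper.ssl.keyStore.location=/etc/ksm/keystore.jks
    -Dzookeeper.ssl.keyStore.password=changeit
    -Dzookeeper.ssl.trustStore.location=/etc/ksm/truststore.jks
    -Dzookeeper.ssl.trustStore.password=changeit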

Stop KSM after one run

Hi Stephane,

is there any way to tell KSM to stop after the first processing/execution of the ACLs?
We want to use an external trigger to start KSM (e.g. a GitHub merge to master or an S3 file upload) and then stop it in order to save costs.

Would it be possible to add this feature, if it doesn't exist yet?

Thanks and regards,
Petros

Slack Notification Exception Handling

If KSM fails to connect to Slack, the KSM app itself terminates, seemingly due to a lack of exception handling around sending Slack notifications. Could some exception handling be added?

[2019-08-27 08:44:14,600] ERROR unexpected exception (com.github.simplesteph.ksm.KafkaSecurityManager$) java.util.concurrent.ExecutionException: java.net.SocketTimeoutException: connect timed out
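
A hedged sketch of the kind of guard being asked for: wrapping the notification call in a Try so a Slack timeout is logged instead of propagating and terminating the app (sendToSlack stands in for the actual webhook call):

    import org.slf4j.LoggerFactory
    import scala.util.{Failure, Success, Try}

    val log = LoggerFactory.getLogger("SlackNotification")

    // placeholder for the actual webhook call
    def sendToSlack(message: String): Unit = ???

    def notifySafely(message: String): Unit =
      Try(sendToSlack(message)) match {
        case Success(_) => ()
        case Failure(e) => log.warn(s"Failed to send Slack notification: ${e.getMessage}")
      }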

Docker Build pushed to Docker Hub

Travis experts, PRs welcome. Requirements:

  • Every build on master is pushed to :latest
  • Every tag / release is pushed to :tag
  • Convenient way of passing creds from Travis CI to Docker Hub (?)
  • Documentation (README.md)

[Slack Notifications] Use new API

Incoming Webhooks are a simple way to post messages from apps into Slack. These integrations lack newer features and they will be deprecated and possibly removed in the future.

In the current implementation, the following properties are defined in the configuration file, but username, icon and channel are ignored.

  • webhook
  • username
  • icon
  • channel

The recommended and latest API is the following:
https://api.slack.com/methods/chat.postMessage

Add No-Op Source (Read Only KSM)

A no-op source would allow a user to decide not to alter ZooKeeper ACLs based on an external source. This would allow a read-only version of KSM to run.
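
A hedged sketch of the idea; the trait below is a simplified stand-in rather than the project's actual source interface, and the point is only that a no-op source never reports changes, so the synchronizer has nothing to apply:

    import java.io.Reader

    // simplified stand-in for a source interface
    trait SimpleSourceAcl {
      // Some(reader) when the ACL source changed, None when there is nothing new
      def refresh(): Option[Reader]
      def close(): Unit
    }

    // no-op source: never reports changes, so KSM effectively runs read-only
    class NoOpSourceAcl extends SimpleSourceAcl {
      override def refresh(): Option[Reader] = None
      override def close(): Unit = ()
    }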

Support HA mode

Would be great to have a KSM master and a standby for HA mode. PR welcome!

Problem with GRPC Gateway Server

On the latest Docker image with v2.0.0 support, I get the following error when I try to hit the gRPC gateway REST endpoint:

java.lang.NoClassDefFoundError: javax/activation/MimetypesFileTypeMap
	at grpcgateway.handlers.SwaggerHandler.<init>(SwaggerHandler.scala:86)
	at grpcgateway.server.GrpcGatewayServerBuilder$$anon$1.initChannel(GrpcGatewayServerBuilder.scala:36)
	at grpcgateway.server.GrpcGatewayServerBuilder$$anon$1.initChannel(GrpcGatewayServerBuilder.scala:32)
	at io.netty.channel.ChannelInitializer.initChannel(ChannelInitializer.java:113)
	at io.netty.channel.ChannelInitializer.handlerAdded(ChannelInitializer.java:105)
	at io.netty.channel.DefaultChannelPipeline.callHandlerAdded0(DefaultChannelPipeline.java:617)
	at io.netty.channel.DefaultChannelPipeline.access$000(DefaultChannelPipeline.java:46)
	at io.netty.channel.DefaultChannelPipeline$PendingHandlerAddedTask.execute(DefaultChannelPipeline.java:1467)
	at io.netty.channel.DefaultChannelPipeline.callHandlerAddedForAllHandlers(DefaultChannelPipeline.java:1141)
	at io.netty.channel.DefaultChannelPipeline.invokeHandlerAddedIfNeeded(DefaultChannelPipeline.java:666)
	at io.netty.channel.AbstractChannel$AbstractUnsafe.register0(AbstractChannel.java:510)
	at io.netty.channel.AbstractChannel$AbstractUnsafe.access$200(AbstractChannel.java:423)
	at io.netty.channel.AbstractChannel$AbstractUnsafe$1.run(AbstractChannel.java:482)
	at io.netty.util.concurrent.AbstractEventExecutor.safeExecute(AbstractEventExecutor.java:163)
	at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:404)
	at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:463)
	at io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:886)
	at io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
	at java.base/java.lang.Thread.run(Thread.java:844)

This is due to the Docker image using "FROM openjdk:latest" as its base, which results in it running on OpenJDK 10.x. I worked around the issue by passing "--add-modules java.activation" as a Java option in my Docker start script. You might want to consider pinning the base Docker image to a known working version. Additionally, you might want to consider using the "-slim" variant of the OpenJDK base image, since the current one is almost 500 MB.
