
FaceVerification

A messy codebase for developing a face verification program.

It includes a C++ face detection / alignment program, Joint Bayesian, and several supplementary scripts. My Caffe model definition file is also provided. Note that I use a new layer called Insanity to replace the ReLU activation; the Insanity layer can be found in my Caffe repository. Please feel free to use the code if you need it.

If you are also interested in face verification, please contact me by opening an issue.

The CASIA-WebFace dataset is really very dirty, and I believe that if someone cleaned it up, the accuracy would increase further. If you do so, please kindly contact me. I will pay for it.

Good news: @潘泳苹果皮 and his colleagues have cleaned the CASIA-WebFace database manually. After cleaning, 27,703 wrong images were deleted. The cleaned list can be downloaded from http://pan.baidu.com/s/1hrKpbm8 . Great thanks to them!

Update

2017/04/21 The project page of my new paper, NormFace: L2 Hypersphere Embedding for Face Verification, has been created at https://github.com/happynear/NormFace. Fine-tuning a network with the new loss functions increases the accuracy a little further.

2017/02/18 I trained a center-face model on the MS-Celeb-1M dataset and got 99.3% on LFW. Here is the model (http://pan.baidu.com/s/1jIJT4Rc) and the aligned LFW images (http://pan.baidu.com/s/1bp7qzJh). To run the evaluation, load the LFW pairs with getPairs.m in aligned_lfw.zip, extract features with ReadFeatureLFW.m, and get the accuracy with lfwPCA.m. The function pcaApply used in lfwPCA.m is from pdollar-toolbox.

Recently I talked with Yandong Wen. He said that more than 1,000 identities overlap between MS-Celeb-1M and LFW, so this 99.3% accuracy is not a reliable value, and the performance on other datasets may not be as good.
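
For reference, here is a minimal sketch of the standard 10-fold LFW pair protocol behind the numbers above. It is not the repository's getPairs.m/lfwPCA.m scripts; it assumes the usual 6,000 pairs (the first 3,000 same-person, the last 3,000 different-person), scores each pair by cosine distance on L2-normalized features, and picks the decision threshold on the nine training folds.

% F1, F2: N x D feature matrices for the two images of each pair (N = 6000)
F1 = bsxfun(@rdivide, F1, sqrt(sum(F1.^2, 2)));   % L2-normalize each row
F2 = bsxfun(@rdivide, F2, sqrt(sum(F2.^2, 2)));
dist = 1 - sum(F1 .* F2, 2);                      % cosine distance per pair
labels = [ones(3000,1); zeros(3000,1)];           % 1 = same person, 0 = different

acc = zeros(10,1);
for k = 1:10
    test_idx  = [(k-1)*300+1 : k*300, 3000+(k-1)*300+1 : 3000+k*300];
    train_idx = setdiff(1:6000, test_idx);
    % choose the threshold that maximizes accuracy on the training folds
    best = 0; th = 0;
    for t = linspace(min(dist), max(dist), 1000)
        a = mean((dist(train_idx) < t) == labels(train_idx));
        if a > best, best = a; th = t; end
    end
    acc(k) = mean((dist(test_idx) < th) == labels(test_idx));
end
fprintf('mean 10-fold accuracy: %.4f\n', mean(acc));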

2015/07/05 Added a MATLAB face alignment wrapper (MatAlignment.cpp). Now you can do the face alignment job in MATLAB. A demo (VerificationDemo.m) is also provided.

Progress

  1. Training DeepID (pure softmax network).

    Create database: done.

    iteration 360,000, lr=0.01,

     LFW verification: L2: 95.9%, JB:
    

    iteration 500,000, lr=0.001,

     LFW verification: L2: 96.8%, JB: 93.3% (strongly overfit; it is >99% on the LFW training set).
    

    iteration 660,000, lr=0.0001,

     LFW verification: L2: 96.78% (converged)
    

    Accuracy on the training set is about 89.5%~91.5%. The LFW result with L2 or cosine has reached what the paper claimed. Joint Bayesian seems to overfit strongly; the main reason is that I only train Joint Bayesian on the LFW training set, not on CASIA-WebFace. Training Joint Bayesian for more than 10,000 classes is too costly for my PC.

    This model is publicly available now: http://pan.baidu.com/s/1qXhNOZE . The mean file is at http://pan.baidu.com/s/1eQYEJU6 .

    Another model with a resolution of 64×64 has also been trained. By ensembling the two models, the accuracy increases to 97.18% (a sketch of one possible ensembling scheme follows this list).

  2. Training DeepID2 (siamese network)

    Create database: done.

    My GPU memory is only 2 GB, so training a siamese network is too slow. I need a Titan X!!!
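
As referenced above, one simple way to ensemble the 100×100 and 64×64 models is to concatenate their L2-normalized features before computing L2 or cosine distances. This is only a sketch: net_100 and net_64 are placeholder names for the two loaded caffe.Net models, the preprocessing (transpose, division by 128) follows the issue code quoted further below, and this is not necessarily the exact scheme behind the 97.18% figure.

% img: an aligned grayscale face crop
im100 = single(imresize(img, [100 100]))' / 128;
im64  = single(imresize(img, [64 64]))'  / 128;
fa = net_100.forward({im100});  fa = fa{1}(:);  fa = fa / norm(fa);
fb = net_64.forward({im64});    fb = fb{1}(:);  fb = fb / norm(fb);
f  = [fa; fb] / sqrt(2);        % concatenated descriptor, still unit length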


Issues

Followed your steps, but only got a 73% verification accuracy

First, thank you for your contribution.

I followed your steps and used the CASIA data with 'mnist_siamese_solver.prototxt' and 'CASIA_train_test.prototxt' from the 'caffe_proto' folder to train the CNN in Caffe, without a mean file. After 500,000 iterations, the validation set accuracy is about 72%. The CNN seems to have finished fitting.

Then I wrote MATLAB code to extract features and do L2 classification on the LFW dataset, but I only get a 73% verification accuracy.

The MATLAB code is:

dr_lfw_features = [];
for i= 1:length(fileLines)
    im = imread( fileLines{i} );    
    im = rgb2gray(im);
    im = imresize(im, [100 100]);
    im = single(im);
    im = im /128.0;
    im = im';
    drop5_data = net.forward({im});
    drop5_data = reshape(drop5_data{1}, 1, 320);
    fprintf('image is %s, i = %d\n', fileLines{i}, i);
    dr_lfw_features = [dr_lfw_features; drop5_data];
end

load('pairlist_lfw.mat');
test_Intra = pairlist_lfw.IntraPersonPair;
test_Extra = pairlist_lfw.ExtraPersonPair;
test_pair = [test_Intra; test_Extra];

F1_index = test_pair(:,1);
F2_index = test_pair(:,2);

AllFeature1 = dr_lfw_features(F1_index, :);
AllFeature2 = dr_lfw_features(F2_index, :);

num = length(AllFeature1);
F1 = AllFeature1;
F1 = bsxfun(@rdivide, F1, sqrt(sum(F1.^2,2)));
F2 = AllFeature2;
F2 = bsxfun(@rdivide, F2, sqrt(sum(F2.^2,2)));
thresh2 = zeros(num,1);
for i = 1:num
    thresh2(i) = pdist2( F1(i,:),F2(i,:) );
end;
figure;
hist(thresh2(1:3000), 500);
figure;
hist(thresh2(3001:end), 500);

accuracies = zeros(10,1);
for i=1:10
    test_idx = [(i-1) * 300 + 1 : i*300, (i-1) * 300 + 3001 : i*300 + 3000];
    train_idx = 1:6000;
    train_idx(test_idx) = [];
    bestc=256;
    same_label = ones(6000,1);
    same_label(3001:6000) = 0;
    cmd = [' -t 0 -h 0'];
    model = svmtrain(same_label(train_idx),thresh2(train_idx),cmd);
    [class, accuracy, deci] = svmpredict(same_label(test_idx),thresh2(test_idx),model);
    accuracies(i) = accuracy(1);
end;
mean(accuracies)
cmd = [' -t 0 -h 0'];
model = svmtrain(same_label,thresh2,cmd);
[class, accuracy, deci] = svmpredict(same_label,thresh2,model);

From the histograms you can see that the distances of same-person pairs and different-person pairs are clearly not separable.

I think there are errors in this MATLAB code. Do you have any suggestions?

Hi

On LFW, using the CASIA_iter_666000.caffemodel and mean.proto from your Baidu drive, the accuracy I get is only 49%.
Why? I use your CASIA_deploy.prototxt to extract the 320-dimensional pool5 features and compute cosine similarity.
The threshold I use is 0.4. Should it be higher?

About joint bayesian

Hello.
We are both **, so I will just write in Chinese.
For this JB code, I have only reproduced its performance on the dataset provided in the original paper.
Have your JB results ever been better than L2?
Even when I train JB on WebFace, the result is still not as good as L2 or PCA :(

hi,

May we invite you to do some face recognition model training work? We can pay you. Thanks!

Your centerloss model on MS-Celeb-1M

Hi,
When I tried to use your center-loss model trained on MS-Celeb-1M, I ran into some problems:
1. Clearly, the "Flip" layer is not supported in the official Caffe version. Would you mind providing the source of your Caffe?
2. In some convolutional layers there is an 'alpha' parameter in weight_filler, which is also not supported in the official Caffe version.

BTW, would you mind sharing your solver.prototxt and train_val.prototxt, since yours seems to be a little different from the original center-loss repo?

Thanks.

Data preprocessing

I don't quite understand why the data is preprocessed this way. Could you explain?
X = sqrt(X);
X = bsxfun(@rdivide, X, sum(X,2));

Which proto to choose

Hi, I'm new to Caffe. I downloaded your files; in the caffe_proto folder I saw mnist_siamese_train_test.prototxt and CASIA_train_test.prototxt. Which one is best for face verification?
The second question is: where are train.txt and val.txt? Could you please send them to me? (I downloaded the @潘泳苹果皮 database.) (My email is [email protected]) Thanks.

How can I initialize the net?

Hi, I am trying to train the net using this code, but I cannot see the loss decrease. Is there some initialization method I should use?

Hello

Hello, I have a question I need to ask.
I have tried six networks of various sizes (all except VGG). My validation set is built by taking the first image from each directory of WebFace, and those images are cut out and removed from the training set.
No matter how I train, trying various parameters and face alignment methods, and even modifying Caffe's learning rate and momentum schedules, the highest validation accuracy I get is only 77%.
Now I see that many people can get around 90% validation accuracy, which is astonishing. I would like to know whether the validation set should be cut out separately or copied out, i.e. whether the training set should keep all of the WebFace images.

Face Verification

Hello Sir,
Thank you for the models and codes you have provided. We are planning to train a face verification model on the CASIA-WebFace dataset from scratch using the "mnist_siamese_train_test.prototxt" model you provided. I have gone through the papers you specified in the previous issues already. How are you generating the training pairs and also how are you labeling them? Thank you in advance.

Face Verification using caffe step by step

@happynear
I tried to test this project (the face verification demo), but I couldn't.
Is VerificationDemo.m not an executable script?
By the way, I would like a C/C++ version;
would you provide me with it?
Honestly, I am a newbie with Caffe, so I hope you can guide me step by step so that I can successfully test the face verification at the end of this session.
Thanks in advance.

Washed CASIA-webface data set identities

Hi, is there a list of identities for the washed CASIA-WebFace dataset? There are just numbers for each identity. I would like to use several databases for training and would like to remove identities that appear in more than one database.

About recognition

Hi:
I'm confused about two questions; could you explain them for me?

First, I want to know: is the data you use to train the net (like DeepID) a single patch?

Second, have you used a classifier such as an SVM on the features extracted from the net for face recognition? What accuracy does that give?

Face Verification (Caffe Newbie)

Hi, I am new to deep learning and Caffe. May I know how I should use your face verification code? I have built the Windows Caffe as described in your blog and tested MNIST as well. But what should I do to run face verification with Caffe?

lbp_lfw_baseline_cvpr13.mat

Thanks for providing the code.

I want to use test_lfw.m, but I can't find the file "lbp_lfw_baseline_cvpr13.mat".

Where can I get this .mat file? Thanks.

help

Could you please provide a DeepID2 model that runs on Linux? Thank you very much!

Finetune your 96.8 model

Hello, thank you for all your contributions. I was planning to fine-tune the 96.8% model you provided in one of the issues (#22) on my own dataset. I think "CASIA_train_test.prototxt" is the training prototxt for that model. Can you please let me know how to label the training data for fine-tuning, or how to generate the train.txt file?

About face alignment

Hello. What method did you use to obtain the aligned images in aligned_lfw.zip?
When reproducing ydwen's experiment, testing with my own aligned LFW gives an accuracy of only 0.981, while testing with your aligned LFW gives 0.988.
I used MTCNN for face detection and landmark detection.

Insanity layer

Hi,

Could you please provide some information/papers regarding the Insanity layer?

Thanks

NN architecture

Hi, thanks for the project.

Are there any papers or descriptions of the NN architecture you use, other than "Learning Face Representation from Scratch"?

Could you please give some links to, or a description of, your "Insanity" layer?

Regards..

Train centerloss face with L2 normalization layer

Hi,

I'm currently trying to train center-loss face with an L2 normalization layer, more specifically, adding an L2 normalization layer after fc5 before feeding it into the last FC layer. However, after adding this L2 normalization layer, the softmax loss decreases very slowly compared to the version without it.

Since others suggested in this issue that the initialization after the L2 normalization layer should be chosen carefully, I tried uniform, Gaussian and Xavier. Only uniform makes the softmax loss decrease a little faster, but it is still very slow.

Do you have any idea how to train center-loss face with an L2 normalization layer? I assume you also used this layer when training center-loss face on MS-Celeb-1M, since I found it in your provided prototxt file, although you commented it out.

Any suggestion is appreciated.

LFW with JB

HI

I want to test LFW with JB. I referred to your code (https://github.com/happynear/FaceVerification/blob/master/test_lfw.m).

But part of the 10-fold cross-validation seems odd to me.
testing = 10;
tmp = pairlist_lfw.IntraPersonPair;
tmp((testing-1)*300+1:testing*300,:) = [];
train_Intra = tmp;
idx = [idx;tmp(:)];
tmp = pairlist_lfw.ExtraPersonPair;
tmp((testing-1)*300+1:testing*300,:) = [];
train_Extra = tmp;

testing = 8;
test_Intra = pairlist_lfw.IntraPersonPair((testing-1)*300+1:testing*300,:);
test_Extra = pairlist_lfw.ExtraPersonPair((testing-1)*300+1:testing*300,:);

The training and test data overlap. Is this correct? Or should I refer to one of your other files (lfwJB.m, test_lfw_validate.m)? Thanks!

About threshold for face authentication

Hi
I'm a beginner in face verification and would like to ask you a few questions.
What is the best threshold for face authentication?
Do you have a better model for face verification now? My own model is not good.
Do you know how to calculate the probability that two faces belong to the same person?
Many thanks.

ContrastiveLoss Layer takes 3 vs. 4 bottom blob(s) as input?

My ContrastiveLoss layer is defined as follows, like yours:

layer {
  name: "loss"
  type: "ContrastiveLoss"
  bottom: "fc6_1"
  bottom: "fc6_2"
  bottom: "label_1"
  bottom: "label_2"
  top: "fc6_loss"
  contrastive_loss_param {
    margin: 0
  }
  loss_weight: 0.001
}

But it produces the following error message:

Check failed: ExactNumBottomBlobs() == bottom.size() (3 vs. 4) ContrastiveLoss Layer takes 3 bottom blob(s) as input.

So I want to know: did you run into the same problem?

Question about network convergence speed

Hello, thank you very much for sharing the face network models. I have a problem: I use the same CASIA data, but the network converges very slowly during training. With lr = 0.01, after several thousand iterations the softmax_loss is still around 8.9 and the test accuracy is around 0.001. Is something wrong here? Thanks!

About DeepID2

hi,
Recently I have still been trying to take the DeepID models from DeepID1 to DeepID2. I have implemented your DeepID1 model, but my DeepID2 model is always incorrect: it not only fails to converge but is also wrong. I wonder whether you are still researching the DeepID2 model and whether you have made any new progress.

caffe: Siamese network does not converge

Hello
I'm currently testing the siamese algorithm to study the similarity between RGB images.
However, I have a convergence problem (I think it's because of the margin m).

Thanks for any help

Is there a corresponding paper?

Sorry to bother you. Is there a corresponding face verification paper? I didn't see a link to one.

The cleaned dataset file is corrupted and cannot be opened

Hello,

After downloading the CASIA-WebFace dataset cleaned by @潘泳苹果皮 from Baidu Cloud, the file is corrupted and cannot be opened (I tried several times). May I ask whether the file you downloaded works?

Thanks

DeepID Caffe training problem

I am training DeepID in Caffe on the CASIA dataset. During training, the validation accuracy only fluctuates between 60% and 65%. What test-net accuracy did your model reach during training? Thanks.

Where is FaceExtract.cpp used?

Hi, Mr. Wang:
I downloaded your FaceVerification, but I could never get MatAlignment.cpp to work, so I debugged dlib's built-in face alignment program instead. After aligning an image with dlib, is any other processing needed? Do I need to use your FaceExtract.cpp to crop the image? Should I crop first and then align, or align first and then crop? Could you describe your overall pipeline? Thank you very much.

input data/diff size does not match target blob shape, input data/diff size: [ 100 100 1 2 ] vs target blob shape: [ 100 100 1 100 ]

When running your VerificationDemo.m I get this error. What could be the cause?

VerificationDemo
Error using CHECK (line 4)
input data/diff size does not match target blob shape, input data/diff size: [ 100 100 1 2 ] vs target blob shape: [
100 100 1 100 ]

Error in caffe.Blob/check_data_size_matches (line 72)
CHECK(is_matched, ...

Error in caffe.Blob/check_and_preprocess_data (line 46)
self.check_data_size_matches(data);

Error in caffe.Blob/set_data (line 26)
data = self.check_and_preprocess_data(data);

Error in caffe.Net/forward (line 122)
self.blobs(self.inputs{n}).set_data(input_data{n});

Error in VerificationDemo (line 53)
f = net.forward(H);

Also:
f = f{1};
fprintf('distance:%f\n',f);
if f<same_thresh
fprintf('The two faces are from the same people.\n');
else
fprintf('The two faces are from different people.\n');
end;
f is an array; I don't understand how the final similarity score is computed from it.
Also, is this check written backwards? It treats values below the threshold as the same person and values above it as different people.

Exception occurs when loading model

In [5]: net = caffe.Net('CASIA_deploy.prototxt', caffe.TEST)
[libprotobuf ERROR google/protobuf/text_format.cc:245] Error parsing text-format caffe.NetParameter: 28:16: Message type "caffe.FillerParameter" has no field named "alpha".
WARNING: Logging before InitGoogleLogging() is written to STDERR
F0816 19:25:50.492918 22483 upgrade_proto.cpp:79] Check failed: ReadProtoFromTextFile(param_file, param) Failed to parse NetParameter file: CASIA_deploy.prototxt

Installation and running

Hi, Mr. Wang,
I came here from caffe-windows and am glad to see that this project has made great progress. Congratulations!
I'm not familiar with MATLAB. How do I run this program? I can't wait to try it :)
Best wishes!

Missing layer: concate_label

First, the layer concate_label is missing.
I am also confused: if we propagate concate_label into softmax_loss, does its loss equal the sum of the two individual softmax losses?

verification problem

Hello:

I am a Caffe user. I trained on the CASIA database and reached 82% identification accuracy, but when I extract the features and run JB, I cannot get the verification rate above 90%. I extract features using the MATLAB functions provided by Caffe, and I wonder whether the problem lies in the feature extraction. Which method do you use to extract features?

About feature extraction

In the feature extraction code, what does f = f{1} mean? From debugging it seems to take the first column of f. Why keep only the first column and discard the rest? Sorry to bother you.

Is PCA helpful?

I recently reproduced your result with your 99.3% model using your MATLAB code.
There is an interesting thing: in lfwPCA.m, if you replace thresh1 with thresh2 when training the SVM, the resulting accuracy is about the same as with thresh1, about 99.25%.
In my test, I changed [U,mu,vars] = pca( [F1_mu(train_idx,:); F2_mu(train_idx,:)]' ); to [U,mu,vars] = pca( [F1_mu(train_idx,:); F2_mu(train_idx,:)] );, using MATLAB 2015b. Also, I downloaded an applyPCA.m because I didn't find that function in your code. I'm not sure whether these changes matter.

Anyway, I trained an SVM model on the 6,000 LFW pairs with OpenCV and used it to test all possible pairs formed from all images in LFW. Unfortunately, the accuracy on same-subject pairs is about 82% and on different-subject pairs about 45%. Do you have any insight into this phenomenon?

Invitation from a facial analysis community

Hi, happynear,

Thanks for such awesome work on face verification. We have built a Tencent academic group to discuss face detection, alignment and recognition, and we hereby invite you to join and share material from your projects. In the meantime, I think you can also learn from others (teachers and fellow students) in the group. It would be great if you chose to join us.

You can find the group number on this page: https://github.com/jwyang/face-alignment

About the label

Hi,
I am puzzled by "label1" of the layer "pair_data1" and "label2" of the layer "pair_data2". Do "label1" and "label2" mean class labels? Don't "same/different" labels need to be generated for each pair before training?
Thanks.

What is the most suitable method for face verification?

Hi Feng Wang.
I have followed all your research on face verification.
This repo looks a bit messy, as you said, but it lets us see all your efforts and is very helpful.
You must have gone through a lot of trouble and gained superb experience during those days.
First of all, I appreciate your providing the refined CASIA database, and thanks also to its makers.

I want to ask: which was the best method for face verification among siamese, triplet, and Joint Bayesian in terms of accuracy? Which method did you select?
What is the highest accuracy you have achieved on LFW so far?

questions about models.

  • You set the batch size to 2 and used Euclidean loss in casia_demo.

2 is a very small batch size; the loss would fluctuate a lot.
Did you order the dataset during training so that two pictures from the same person form a pair? Like below:
Assumed correct:
first_person_1
first_person_2
second_person_1
second_person_2
Assumed wrong:
first_person_1
second_person_1
In the wrong case the loss is computed between different people, so it should be pushed up, not minimized.

  • You set 3 loss layers in the siamese training prototxt.

In this case, which loss will Caffe decrease during optimization?
Finally, I contacted Xiang Wu, whom you referred to, and tested his model, but the accuracy on LFW does not reach what he reported in his repo: just 93% for the A model on LFW without the mean image.
I know this question is not really for you, but if you have any information about it, I would like to ask that as well.
Thanks in advance for your helpful reply.

DeepID2

@happynear
Hi, I have been researching face verification for a while and have recently been learning DeepID. I found your DeepID2 prototxt in "mnist_siamese_train_test". However, I have a problem and need your help.

  1. First, how do you prepare the dataset (WebFace)? I am not very clear about webface_pair1_train, webface_pair2_train, webface_pair1_test and webface_pair2_test: how do you partition the dataset, and why are there two training sets and two test sets?
    Thanks very much!

Confused about the CASIA_train_test.prototxt training loss

I use CASIA_train_test.prototxt with my own dataset (2,000 people, 50 images each, over 100,000 images in total).
I want to train the DeepID network using CASIA_train_test.prototxt, changing only the "ip1" layer output num from 10575 to 2000.
But the softmax loss does not decrease, and the test accuracy is very small.

In a correct run, what accuracy should be reached?
And what should the loss look like?

My aligned data looks like this:
aaron_eckhart_001
aaron_eckhart_002

DeepID model

I have some trouble understanding the models you released. Because I'm not familiar with how a siamese network works in Caffe, I can't see why, for example, conv11 and conv12 have different numbers of outputs (asymmetric). Have you tried training without a siamese structure, like the first version of DeepID does? I've been working on this for weeks, but the best result I've achieved is only about 82%. Did you also crop patches to train ensemble models?

How to generate MatAlignment.mexw64 with opencv 3.0

hi,
I built caffe-windows from your link https://github.com/happynear/caffe-windows and generated the caffe_.mexw64 file.
Then I ran VerificationDemo.m in MATLAB after addpath(caffe matlab path).
It shows the error:
Invalid MEX-file 'D:\FaceVerification-master\FaceVerification-master\MatAlignment.mexw64': The specified module could not be found.

I used Dependency Walker to check the dependencies of the MatAlignment.mexw64 file.
It shows:
OPENCV_CORE248.DLL error opening file. The system cannot find the file specified.

I have OpenCV 3.0. How can I build MatAlignment.mexw64 with OpenCV 3.0?

FaceVerification in caffe-windows

Hi, Mr.Wang,
I found a directory named "faceverification" and also faceverification.cpp in caffe-windows, which is another project you own. Are there any differences between that one and this one?

I compiled faceverification.cpp with caffe-windows successfully; however, I do not know how to train a model that it can use:

DEFINE_string(model, "",
"The model definition protocol buffer text file..");
DEFINE_string(weights, "",
"Optional; the pretrained weights to initialize finetuning. "

"Cannot be set simultaneously with snapshot.");

Net<float> caffe_test_net(FLAGS_model, TEST);

I could not find any documentation on how to train the related model. There is a run_verification.bat and 2 prototxt files, but I do not know how to use them :)

BTW, is there any pre-trained model I could use for this project? Thanks a lot.

B.R
Anguo Yang
