ztachip
Open-source software/hardware platform to build edge AI solutions deployed on FPGA or custom ASIC hardware.

License: MIT License



Introduction

ztachip is a RISCV accelerator for vision and AI edge applications running on low-end FPGA devices or custom ASIC.

Acceleration provided by ztachip can be up to 20-50x compared with a non-accelerated RISCV implementation on many vision/AI tasks. ztachip also performs better than a RISCV core equipped with the vector extension.

An innovative tensor processor is implemented in hardware to accelerate a wide range of tasks, from common vision functions such as edge detection, optical flow, motion detection and color conversion to the execution of TensorFlow AI models. This is one key difference between ztachip and other accelerators, which tend to target only a narrow range of applications (for example, convolutional neural networks only).

A new tensor programming paradigm is introduced to allow programmers to leverage the massive processing/data parallelism enabled by the ztachip tensor processor.

ztachip demo video

Documentation

Technical overview

Hardware Architecture

Programmers Guide

VisionAI Stack Programmers Guide

MicroPython Programmers Guide

Code structure

  • SW/compiler: compiler to generate instructions for the tensor processor.

  • SW/apps: vision and AI stack implementation. Many prebuilt acceleration functions are provided, giving programmers a fast path to leverage ztachip acceleration. This folder is also a good place to learn how to program your own custom acceleration functions.

  • SW/base: SW framework library and some utilities.

  • SW/fs: read-only file system to be downloaded together with the build image.

  • SW/src: code for the reference design example. This is a good place to learn how to use the ztachip prebuilt vision and AI stack.

  • HW/examples: HDL code for the reference design.

  • HW/examples/GHRD/MyVexRiscv.scala: the RISCV core used in this example is based on the VexRiscv implementation. This file is used by the VexRiscv project to generate the RISCV core.

  • HW/platform: a thin wrapper layer that helps ztachip synthesize efficiently on different FPGAs or ASICs. Choose the sub-folder that corresponds to your FPGA target. A generic implementation is also provided for the simulation environment. Any FPGA/ASIC can be supported with an appropriate implementation of this wrapper layer.

  • HW/src: main ztachip HDL source code.

Build procedure

The build procedure produces two separate images.

One image is a standalone executable where user applications access ztachip through a native [C/C++ library interface](https://github.com/ztachip/ztachip/raw/master/Documentation/visionai_programmer_guide.pdf).

The second image is a micropython port of ztachip. With this image, applications access ztachip through a Python programming interface.

Prerequisites (Ubuntu)

sudo apt-get install autoconf automake autotools-dev curl python3 libmpc-dev libmpfr-dev libgmp-dev gawk build-essential bison flex texinfo gperf libtool patchutils bc zlib1g-dev libexpat-dev python3-pip
pip3 install numpy

Download and build RISCV toolchain

The build below takes a fairly long time.

export PATH=/opt/riscv/bin:$PATH
git clone https://github.com/riscv/riscv-gnu-toolchain
cd riscv-gnu-toolchain
./configure --prefix=/opt/riscv --with-arch=rv32im --with-abi=ilp32
sudo make

Download ztachip

git clone https://github.com/ztachip/ztachip.git

Build ztachip as standalone image

export PATH=/opt/riscv/bin:$PATH
cd ztachip
cd SW/compiler
make clean all
cd ../fs
python3 bin2c.py
cd ..
make clean all -f makefile.kernels
make clean all
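
The bin2c.py step above converts the read-only file system under SW/fs into C source that is linked into the build image. As a rough, hedged sketch of what a bin2c-style conversion does (this is not the project's actual bin2c.py; the file and symbol names are placeholders):

# Hedged sketch of a bin2c-style conversion; NOT the project's bin2c.py.
# Packs a binary file into a C array so it can be compiled into the image.
def bin2c(path, symbol):
    with open(path, "rb") as f:
        data = f.read()
    lines = [f"const unsigned char {symbol}[{len(data)}] = {{"]
    for i in range(0, len(data), 12):
        chunk = ", ".join(f"0x{b:02x}" for b in data[i:i + 12])
        lines.append(f"    {chunk},")
    lines.append("};")
    return "\n".join(lines)

if __name__ == "__main__":
    print(bin2c("example.bin", "fs_image"))  # placeholder names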

Build ztachip as micropython port

You are required to complete the previous standalone-image build procedure even if your target is the micropython image. Below is the procedure to build the micropython image after you have completed the standalone image build.

git clone https://github.com/micropython/micropython.git
cd micropython/ports
cp -avr <ztachip installation folder>/micropython/ztachip_port .
cd ztachip_port
export PATH=/opt/riscv/bin:$PATH
export ZTACHIP=<ztachip installation folder>
make clean
make

Build FPGA

  • Download Xilinx Vivado Webpack free edition.

  • Create the project file, build the FPGA image and program it to flash as described in the FPGA build procedure.

Run reference design example

The following demos run on the ArtyA7-100T FPGA development board.

  • Image classification with TensorFlow's MobileNet

  • Object detection with TensorFlow's SSD-MobileNet

  • Edge detection using Canny algorithm

  • Point-of-interest using Harris-Corner algorithm

  • Motion detection

  • Multi-tasking with object detection, edge detection, Harris-Corner and motion detection running at the same time

To run the demo, press button0 to switch between different AI/vision applications.

Preparing hardware

The reference design example requires the hardware components below.

Attach the VGA and camera modules to the Arty-A7 board according to the picture below.

[picture: arty_board]

Connect the camera module to the Arty board according to the picture below.

[picture: camera_to_arty]

Open serial port

If you are running ztachip's micropython image, then you need to connect to the serial port. Arty-A7 provides serial port connectivity via USB. Serial port flow control must be disabled.

sudo minicom -w -D /dev/ttyUSB1

Note: After connecting to the serial port for the first time, reset the board again (press the button next to the USB port and wait for the LED to turn green), since the USB serial device must be the first device to connect to USB, before ztachip.

Download and build OpenOCD package required for GDB debugger's JTAG connectivity

In this example, we will load the program using the GDB debugger and JTAG.

sudo apt-get install libtool automake libusb-1.0.0-dev texinfo libusb-dev libyaml-dev pkg-config
git clone https://github.com/SpinalHDL/openocd_riscv
cd openocd_riscv
./bootstrap
./configure --enable-ftdi --enable-dummy
make
cp <ztachip installation folder>/tools/openocd/soc_init.cfg .
cp <ztachip installation folder>/tools/openocd/usb_connect.cfg .
cp <ztachip installation folder>/tools/openocd/xilinx-xc7.cfg .
cp <ztachip installation folder>/tools/openocd/jtagspi.cfg .
cp <ztachip installation folder>/tools/openocd/cpu0.yaml .

Launch OpenOCD

Make sure the green LED below the reset button (near the USB connector) is on. This indicates that the FPGA has been loaded correctly. Then launch OpenOCD to provide JTAG connectivity for the GDB debugger.

cd <openocd_riscv installation folder>
sudo src/openocd -f usb_connect.cfg -c 'set MURAX_CPU0_YAML cpu0.yaml' -f soc_init.cfg

Uploading SW image via GDB debugger

Upload procedure for standalone SW image option

Open another terminal, then issue the commands below to upload the standalone image.

export PATH=/opt/riscv/bin:$PATH
cd <ztachip installation folder>/SW/src
riscv32-unknown-elf-gdb ../build/ztachip.elf

Upload procedure for micropython SW image option

Open another terminal, then issue the commands below to upload the micropython image.

export PATH=/opt/riscv/bin:$PATH
cd <Micropython installation folder>/ports/ztachip_port
riscv32-unknown-elf-gdb ./build/firmware.elf

Start the image transfer

From the GDB debugger prompt, issue the commands below. This step takes some time since some AI models are also transferred.

set pagination off
target remote localhost:3333
set remotetimeout 60
set arch riscv:rv32
monitor reset halt
load
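
The six commands above can also be saved to a file (say, upload.gdb, a name chosen here for illustration) and passed to GDB with its standard -x option, which executes them automatically at startup:

riscv32-unknown-elf-gdb -x upload.gdb ../build/ztachip.elf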

Run the program

After successfully loading the program, issue the command below at the GDB prompt.

continue

Running standalone image

If you are running the standalone image, press button0 to switch between the different AI/vision applications. The sample application is implemented in vision_ai.cpp.

Running micropython image

If you are running the micropython image, Micropython allows you to enter Python code in paste mode over the serial port.
To use paste mode, hit Ctrl+E, paste one of the examples into the serial terminal, then hit Ctrl+D to execute the Python code.
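
Before trying the ztachip examples, you can confirm that paste mode works with a plain MicroPython snippet such as the one below (generic MicroPython, not ztachip-specific): hit Ctrl+E, paste it, then hit Ctrl+D.

# Generic MicroPython sanity check (not ztachip-specific)
import sys
print("MicroPython", sys.version)
for i in range(3):
    print("paste mode OK", i)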

Hit any button to return back to Micropython prompt.

How to port ztachip to other FPGA, ASIC and SOC

Click here for the procedure on how to port ztachip and its applications to other FPGA/ASIC and SOC.

Run ztachip in simulation

First build the example test program for simulation. The example test program is under SW/apps/test and SW/sim.

export PATH=/opt/riscv/bin:$PATH
cd ztachip
cd SW/compiler
make clean all
cd ..
make clean all -f makefile.kernels
make clean all -f makefile.sim

Then compile all the RTL code below for simulation:

HW/src
HW/platform/simulation
HW/simulation

The top component of your simulation is HW/simulation/main.vhd

main:reset_in must be driven low for a few clocks before going high.

main:clk_x2_main must be twice the speed of main:clk_main and in phase.

main:led_out should blink every time a test passes.
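
These requirements can be encoded in whatever testbench drives main.vhd. As an illustration only (this file is not part of the project), a cocotb-style driver using the port names above could look like the sketch below; the clock periods are assumptions, not project specifications.

# Hedged cocotb sketch (not shipped with ztachip); port names from main.vhd.
import cocotb
from cocotb.clock import Clock
from cocotb.triggers import ClockCycles

@cocotb.test()
async def smoke_test(dut):
    # Start both clocks together so they stay in phase;
    # clk_x2_main runs at exactly twice clk_main's frequency.
    # 50 MHz / 100 MHz are assumed values, not project specifications.
    cocotb.start_soon(Clock(dut.clk_main, 20, units="ns").start())
    cocotb.start_soon(Clock(dut.clk_x2_main, 10, units="ns").start())

    dut.reset_in.value = 0              # hold reset low for a few clocks
    await ClockCycles(dut.clk_main, 10)
    dut.reset_in.value = 1              # then release it

    # Give the design time to run; led_out should blink as tests pass.
    await ClockCycles(dut.clk_main, 1000)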

Contact

This project is free to use. But for business consulting and support, please contact [email protected]

Follow ztachip on Twitter: https://twitter.com/ztachip


ztachip's Issues

Couldn't find much documentation or open communication channels

Hey, I am an engineering student who is new to this project and wants to know more about it. Can I use FPGA boards other than the recommended one, and can I get a link to the Slack channel? I would also like to know about the overall structure of the project.

No Video Output

Hi! My issue in #5 has been resolved. Thanks to the reference you gave.

Now the program loads successfully, but there is no display on the LCD screen. I think it doesn't detect any video signals from the FPGA device. I checked the VGA cable and it works properly. I am not sure about the camera, but even if the camera is not working, the LCD should at least show a blank display.

According to my analysis, there can be several reasons.

The first one can be these lines in the constraint file, which I didn't change:

set_property CLOCK_DEDICATED_ROUTE FALSE [get_nets CAMERA_PCLK_IBUF]
set_property CLOCK_DEDICATED_ROUTE BACKBONE [get_nets clk_wiz_inst/inst/clk_in1_clk_wiz_0] # Wrote to avoid error during implementation
set_property C_CLK_INPUT_FREQ_HZ 300000000 [get_debug_cores dbg_hub]
set_property C_ENABLE_CLK_DIVIDER false [get_debug_cores dbg_hub]
set_property C_USER_SCAN_CHAIN 1 [get_debug_cores dbg_hub]
connect_debug_port dbg_hub/clk [get_nets clk]

These lines were written for the Arty A7 100T and I am using a Genesys 2 Kintex-7, so don't these lines need any change due to the change in device? Also, the figure 300000000 in the C_CLK_INPUT_FREQ_HZ line corresponds to 300 MHz (not 30 MHz), but in the usb_connect.cfg file the frequency written is 30 MHz.

The second reason can be that I am using the built-in VGA connector of the Genesys 2 instead of the Pmod VGA. Given below is the schematic of the VGA connector of my device:
[screenshot: Screenshot from 2023-04-04 16-53-38]
This VGA connector supports RGB565 color display, whereas the vga.vhd file uses the algorithm for the RGB444 color scheme. I tried changing it but I couldn't properly understand the algorithm. So what I did is use the Genesys-2-Master.xdc file from Digilent to understand the connections of the built-in VGA connector, and just used the first 4 RGB signals from the 565 scheme, like below:

# VGA Connector
set_property -dict { PACKAGE_PIN AH20  IOSTANDARD LVCMOS33 } [get_ports { VGA_B[0] }]; #IO_L22N_T3_12 Sch=vga_b[3]
set_property -dict { PACKAGE_PIN AG20  IOSTANDARD LVCMOS33 } [get_ports { VGA_B[1] }]; #IO_L22P_T3_12 Sch=vga_b[4]
set_property -dict { PACKAGE_PIN AF21  IOSTANDARD LVCMOS33 } [get_ports { VGA_B[2] }]; #IO_L19N_T3_VREF_12 Sch=vga_b[5]
set_property -dict { PACKAGE_PIN AK20  IOSTANDARD LVCMOS33 } [get_ports { VGA_B[3] }]; #IO_L24P_T3_12 Sch=vga_b[6]
# set_property -dict { PACKAGE_PIN AG22  IOSTANDARD LVCMOS33 } [get_ports { vga_b[4] }]; #IO_L20P_T3_12 Sch=vga_b[7]

set_property -dict { PACKAGE_PIN AJ23  IOSTANDARD LVCMOS33 } [get_ports { VGA_G[0] }]; #IO_L21N_T3_DQS_12 Sch=vga_g[2]
set_property -dict { PACKAGE_PIN AJ22  IOSTANDARD LVCMOS33 } [get_ports { VGA_G[1] }]; #IO_L21P_T3_DQS_12 Sch=vga_g[3]
set_property -dict { PACKAGE_PIN AH22  IOSTANDARD LVCMOS33 } [get_ports { VGA_G[2] }]; #IO_L20N_T3_12 Sch=vga_g[4]
set_property -dict { PACKAGE_PIN AK21  IOSTANDARD LVCMOS33 } [get_ports { VGA_G[3] }]; #IO_L24N_T3_12 Sch=vga_g[5]
# set_property -dict { PACKAGE_PIN AJ21  IOSTANDARD LVCMOS33 } [get_ports { vga_g[4] }]; #IO_L23N_T3_12 Sch=vga_g[6]
# set_property -dict { PACKAGE_PIN AK23  IOSTANDARD LVCMOS33 } [get_ports { vga_g[5] }]; #IO_L17P_T2_12 Sch=vga_g[7]

set_property -dict { PACKAGE_PIN AK25  IOSTANDARD LVCMOS33 } [get_ports { VGA_R[0] }]; #IO_L15N_T2_DQS_12 Sch=vga_r[3]
set_property -dict { PACKAGE_PIN AG25  IOSTANDARD LVCMOS33 } [get_ports { VGA_R[1] }]; #IO_L18P_T2_12 Sch=vga_r[4]
set_property -dict { PACKAGE_PIN AH25  IOSTANDARD LVCMOS33 } [get_ports { VGA_R[2] }]; #IO_L18N_T2_12 Sch=vga_r[5]
set_property -dict { PACKAGE_PIN AK24  IOSTANDARD LVCMOS33 } [get_ports { VGA_R[3] }]; #IO_L17N_T2_12 Sch=vga_r[6]
# set_property -dict { PACKAGE_PIN AJ24  IOSTANDARD LVCMOS33 } [get_ports { vga_r[4] }]; #IO_L15P_T2_DQS_12 Sch=vga_r[7]

set_property -dict { PACKAGE_PIN AF20  IOSTANDARD LVCMOS33 } [get_ports { VGA_HS_O }]; #IO_L19P_T3_12 Sch=vga_hs
set_property -dict { PACKAGE_PIN AG23  IOSTANDARD LVCMOS33 } [get_ports { VGA_VS_O }]; #IO_L13N_T2_MRCC_12 Sch=vga_vs

I am not sure if this was the correct way or not. I read about the 565 scheme and learned that the extra signals just enhance the quality and nothing else, so it should work fine.
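
For reference, the mapping described above (keeping the most significant 4 bits of each RGB565 channel) can be sketched as follows; this is an illustration, not project code.

# Illustrative only: RGB565 -> RGB444 by keeping each channel's top 4 bits.
def rgb565_to_rgb444(pixel):
    r5 = (pixel >> 11) & 0x1F   # 5 red bits
    g6 = (pixel >> 5) & 0x3F    # 6 green bits
    b5 = pixel & 0x1F           # 5 blue bits
    return ((r5 >> 1) << 8) | ((g6 >> 2) << 4) | (b5 >> 1)

print(hex(rgb565_to_rgb444(0xFFFF)))  # 0xfff: full white stays full white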

Kindly help me out in this issue if you understand the problem. Thank you!

Error: 'JTAG scan chain interrogation failed: all zeroes' while launching OpenOCD

Hey! I hope you're doing fine.
I am using this project with my FPGA device, a Genesys 2 Kintex-7 (XC7K325T-2FFG900C), and Vivado ML Edition 2022.1.
Link to its reference manual : https://digilent.com/reference/programmable-logic/genesys-2/reference-manual

I did my best to change all the settings in the IP cores and in the main.v file for my device. I changed the DDR3 configuration, clock frequency, etc. Then I changed its constraint file for my device. Instead of using the Pmod VGA (because of unavailability) I used the built-in VGA connector (DB15) on my FPGA device. I couldn't find anything to change in vga.vhdl, but I changed the connections of the VGA in the constraint file, although the last few lines of the constraint file remain the same:

set_property CLOCK_DEDICATED_ROUTE FALSE [get_nets CAMERA_PCLK_IBUF]
set_property CLOCK_DEDICATED_ROUTE BACKBONE [get_nets clk_wiz_inst/inst/clk_in1_clk_wiz_0] # Added this line to avoid error in implementation
set_property C_CLK_INPUT_FREQ_HZ 300000000 [get_debug_cores dbg_hub]
set_property C_ENABLE_CLK_DIVIDER false [get_debug_cores dbg_hub]
set_property C_USER_SCAN_CHAIN 1 [get_debug_cores dbg_hub]
connect_debug_port dbg_hub/clk [get_nets clk]

I couldn't change these lines because I couldn't understand them, but I think they are linked to the debugging process.

Then I ran synthesis and implementation and generated the bitstream. Then I programmed the device through JTAG, using SPI flash and the memory part found in its reference manual. The device programmed successfully after some time, as shown by the LED on my device.

Then I installed Ubuntu 22.04.1 in Oracle VM VirtualBox for the software part. I followed the instructions given in the readme.md document to install the prerequisites, the RISC-V compiler and ztachip, OpenOCD and the GDB debugger.

But when I launch OpenOCD as instructed, I encounter the following error, although OpenOCD still runs:
[screenshot: InkedScreenshot from 2023-04-01 19-39-41]
I didn't change anything in the configuration files being used. I couldn't understand them and I thought maybe it would still work. If any change is required in them for my device, kindly let me know.

Then I still launched the GDB debugger because OpenOCD was listening on port 3333 for GDB connections. When I entered the monitor reset halt command, it again gave me the same error:
[screenshot: InkedScreenshot from 2023-04-01 19-42-46]

After the program loaded, the connection closed:
[screenshot: InkedScreenshot from 2023-04-01 19-43-37]

This is the view on the OpenOCD terminal:
[screenshot: InkedScreenshot from 2023-04-01 19-43-56]

Kindly help me out in this issue. Thank you!

About Video Output

Hello, I compiled and wrote the program according to the document and successfully used GDB for communication and program loading. However, I have not been able to get any output on the video screen. I'm not sure if it's the video output or if the program isn't working properly.

Video Output got stuck

[screenshot: capture]

After executing the program with the "continue" command, I observed that the output displayed on the monitor remains static for only one second. Subsequently, it becomes unresponsive and does not update, persisting in a frozen state even after pressing button 0.

About LUTs

Hello, I encountered a problem while trying to deploy the project.
When I run implementation in Vivado, it says the LUTs are over-utilized. I wonder why that would happen, as I used the same FPGA board as mentioned in the readme.

Query Regarding Porting to the following FPGAs

Hey there!
Can we port ztachip to the following FPGAs with fewer than 100,000 LUTs:

  1. Digilent Cmod A7-35T: Artix-7 with 35,000 LUTs
  2. Cmod A7-15T: Artix-7 FPGA Module with 15,000 LUTs
  3. CrossLink NX from Lattice Semiconductors (LIFCL-40, 39,000 LUTs)

Please let me know any steps to follow

Some questions about simulation

Hello! I am a graduate student and I would like to delve deeper into the design of your code in the main.m file. There are some parts of the logic that are not very clear to me, and I would like to ask you for some guidance on how to gain a deeper understanding of your design. Where should I start if I want to understand your design more thoroughly?

   // Set pcore process0's constant memory space.
   if(c_len > 0) {
      > DTYPE(INT16) PCORE[*].root.constant[0:c_len-1] <= DTYPE(INT16)MEM((uint32_t)c_p)[0:c_len-1];
   }
   ZTAM_GREG(0,REG_DP_VM_TOGGLE,0)=0;

   // Set pcore code space
   > PROG((pcoreLen>>1)) <= DTYPE(INT16)MEM((uint32_t)pcore_p,(pcoreLen>>1)<<2)[:];
   > BARRIER;

Deploying Custom model

I want to deploy my custom model on ztachip. Could you assist me with the deployment process or point me towards documentation?

About Benchmarking

Hi Vuong,

It's me again, from issues #6 and #5. I got back to this project recently and have been revising everything.

I wanted to ask about measuring the performance of ztachip. My main goal is to see/show how well ztachip performs on AI workloads compared with other hardware architectures, be it a standalone VexRiscv or an Nvidia GPU.

As you mentioned in the readme file:

Acceleration provided by ztachip can be up to 20-50x compared with a non-accelerated RISCV implementation on many vision/AI tasks. ztachip also performs better than a RISCV core equipped with the vector extension.

And in the overview file:

Using the popular MobileNet-SSD AI model as a reference point, ztachip achieves a performance of 10fps with 20GOPS of hardware computing resources, while the Nvidia Jetson Nano achieves 40fps with 500GOPS of computing resources. Therefore ztachip has 6x better computing resource utilization than Nvidia in this case, resulting in much lower power consumption.

I simply want to reproduce these results, if possible, or in any way generate results that are comparable with other hardware architectures. It would be great to compare it with a RISC-V processor with the vector extension, but comparing with the Nvidia Jetson Nano is also fine. I haven't benchmarked any hardware before, so can you please guide me with this?

PS: I am still using the older version of the project, which is working fine, not the newer one with micropython support. I tried running the latest one but it wasn't working, maybe because I made some mistake while porting.

Regards,
Faizan

compilation failed "final link failed: bad value"

Hi,
I tried building ztachip and got a "final link failed" error. I followed exactly the steps outlined, including a fresh build of RISCV gcc. Here is part of the log:

/opt/riscv/lib/gcc/riscv32-unknown-elf/12.2.0/../../../../riscv32-unknown-elf/bin/ld: warning: build/ztachip.elf has a LOAD segment with RWX permissions
/opt/riscv/lib/gcc/riscv32-unknown-elf/12.2.0/../../../../riscv32-unknown-elf/bin/ld: /opt/riscv/lib/gcc/riscv32-unknown-elf/12.2.0/../../../../riscv32-unknown-elf/lib/libstdc++.a(eh_globals.o): in function `_GLOBAL__sub_I___cxa_get_globals_fast':
eh_globals.cc:(.text.startup._GLOBAL__sub_I___cxa_get_globals_fast+0x0): undefined reference to `__dso_handle'
/opt/riscv/lib/gcc/riscv32-unknown-elf/12.2.0/../../../../riscv32-unknown-elf/bin/ld: eh_globals.cc:(.text.startup._GLOBAL__sub_I___cxa_get_globals_fast+0xc): undefined reference to `__dso_handle'
/opt/riscv/lib/gcc/riscv32-unknown-elf/12.2.0/../../../../riscv32-unknown-elf/bin/ld: build/ztachip.elf: hidden symbol `__dso_handle' isn't defined
/opt/riscv/lib/gcc/riscv32-unknown-elf/12.2.0/../../../../riscv32-unknown-elf/bin/ld: final link failed: bad value
collect2: error: ld returned 1 exit status
make: *** [makefile:118: build/ztachip.elf] Error 1

Might you have any idea what's wrong? Thanks a lot.
Regards,

Is it possible to increase the video resolution to full HD?

Your project looks amazing and I would like to try it out. But I have a camera with full HD HDMI output. I could convert the HDMI input to an AXI stream, but it looks like the resolution of the input video is fixed at 640x480 in the project, both in HW and SW. Is it possible to increase the resolution to 1920x1080?
