Name: Arash A. Sabet
Type: User
Company: Umplify Technologies Inc.
Bio: Entrepreneur and enthusiast in startups and technology. Raptors fan.
Twitter: ArashSabet
Location: Toronto, Canada
Blog: https://umplify.com
Arash A. Sabet's Projects
Cross-platform C# web crawler framework built for speed and flexibility. Please star this project!
Run Azure DevOps Agent in Kubernetes using KEDA
Created with StackBlitz ⚡️
:technologist: My personal technology weblog
A web-based DMS application that facilitates the management of an automobile dealership, increasing its efficiency and making day-to-day administration easier.
☁️ Azure Service Bus service issue tracking and samples
Azure SignalR Service SDK for .NET
High-performance, distributed, thread-safe unique ID generator for Azure.
A simple web crawler, using Abot, that indexes page contents into Azure Search.
Code samples for Azure SignalR
Profile repository
Landing website + Blog using Jekyll & Tailwind CSS
The purpose of this repository is to demonstrate DbContext registration issue in Azure Functions V2
🃏 A magical documentation site generator.
An unambitious, simple, and fluent implementation of a task pipeline in .NET
For empowering community 🌱
Export GitHub Project cards into CSV files
An easy way to perform background job processing in your .NET and .NET Core applications. No Windows Service or separate process required
:dragon: Product marketing template for Jekyll
📚IPFS documentation platform
Serif is a beautiful business theme for Jekyll.
:triangular_ruler: Jekyll theme for building a personal site, blog, project documentation, or portfolio.
A self-describing metadata format for IPFS (InterPlanetary File System) entities for .NET (C#, VB, F#, etc.)
Web Crawler/Spider for NodeJS + server-side jQuery ;-)
The Robots Exclusion Protocol, or robots.txt protocol, is a convention that lets website owners ask cooperating web spiders and other web robots not to access all or part of a site that is otherwise publicly viewable. This project provides an easy-to-use C# class for working with robots.txt files.
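The project above is a C# implementation; as a quick illustration of what the Robots Exclusion Protocol does, here is a minimal sketch using Python's standard-library `urllib.robotparser` (not the project's own API — the rules and URLs are made up for the example):

```python
from urllib.robotparser import RobotFileParser

# A sample robots.txt: every user agent is asked to stay out of /private/.
robots_txt = """
User-agent: *
Disallow: /private/
""".splitlines()

parser = RobotFileParser()
parser.parse(robots_txt)

# A cooperating crawler checks each URL before fetching it.
print(parser.can_fetch("MyCrawler", "https://example.com/private/page"))  # False
print(parser.can_fetch("MyCrawler", "https://example.com/public/page"))   # True
```

The key point of the protocol is that enforcement is voluntary: `can_fetch` only tells a well-behaved crawler what the site owner has asked; it does not block access by itself.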