
microsoft / analysis-services


Git repo for Analysis Services samples and community projects

License: MIT License

Languages: C# 90.56%, Batchfile 0.07%, PowerShell 4.05%, TypeScript 3.04%, JavaScript 0.09%, CSS 0.23%, HTML 0.36%, Python 0.83%, TSQL 0.77%

analysis-services's Introduction

Analysis Services

Git repo for Analysis Services samples and community projects

A customizable QPU AutoScale solution for AAS that supports scale up/down as well as in/out

ALM Toolkit is a schema diff tool for tabular models

Automated partition management of Analysis Services tabular models

Real-time monitoring of Analysis Services memory usage broken out by database

The ASTrace tool captures a Profiler trace and writes it to a SQL Server table without requiring a GUI

The AsXEventSample sample shows how to collect streaming xEvents

Sample U-SQL scripts that demonstrate how to process a TPC-DS data set in Azure Data Lake.

Python script to reassemble job graph events from Analysis Services.

A curated set of rules covering best practices for tabular model performance and design which can be run from Tabular Editor's Best Practice Analyzer.

Metadata Translator can translate the names, descriptions, and display folders of the metadata objects in a semantic model by using Azure Cognitive Services.

Code of Conduct

This project has adopted the Microsoft Open Source Code of Conduct. For more information, see the Code of Conduct FAQ or contact [email protected] with any additional questions or comments.


analysis-services's Issues

Cannot connect to Power BI dataset

I am using v5.0.5 on Windows 10. I have:

  • enabled XMLA read/write on the Premium P1 capacity
  • enabled the enhanced metadata option in Power BI Desktop
  • published a dataset to a workspace on a Premium capacity
  • copied the XMLA URL from the service

When I attempt to connect, I can authenticate and select the target dataset in the dropdown list box, but then I get this error: an internal error has occurred.

Any idea what I could do to solve this?

Please provide support for Power BI pbix files

We are looking to merge multiple pbix reports into a single Analysis Services model and want to use the compare tool to create a central model.

Ideally, list the running Power BI 'servers' and their databases.

M query template: Error with join/merged queries?

Hi,

(Task: SCD history table might be missing some data historically. Use today's version of that table to supply missing bits. Solution: join history to current, and perform COALESCE)

How does one handle a scenario with multiple M queries that are joined/merged/expanded, in the TemplateQuery?

I constructed this in Visual Studio, where the queries load the resulting table data just fine.

But I'm having trouble running the same thing in the automated partitioning client. When I just put in the "main" query's M code, the partitioning client fails with "don't know what that child table reference is".

And I'm not having any luck including both M queries in the TemplateQuery column - syntax errors of various sorts. Maybe it's not even possible to have "concatenated" queries in M.

Any suggestions? Am I just approaching the situation wrongly? I know I could perform this in DAX, but I was hoping to do it in M.

thanks,
seedjay

ASTrace on SQL SSAS 2016 installation

Hi,

I compiled the latest version of ASTrace from CodePlex and deployed it to a SQL instance, and when I started it, it threw this error:

24/02/2018 01:02:18: Cannot start Analysis Server trace:

24/02/2018 01:02:18: Analysis Server name: 'SSAS Server Name'

24/02/2018 01:02:18: Trace definition : 'C:\ASTrace2016\Extended.tdf'

24/02/2018 01:02:18: Error: Failed to initialize object as reader.

24/02/2018 01:02:18: Stack Trace: at Microsoft.SqlServer.Management.Trace.TraceServer.InitializeAsReader(ConnectionInfoBase serverConnInfo, String profileFileName)

at ASTrace.Trace.ConnectOlap(TraceServer& traceServer, String SSASserver) in \Path\ASTrace CS - 2016\ASTrace\Service1.cs:line 124

INNER EXCEPTION: Object reference not set to an instance of an object.

at Microsoft.SqlServer.Management.Trace.TraceServer.InitializeAsReader(ConnectionInfoBase serverConnInfo, String profileFileName)

Any clue?

Regards
Rajaniesh

Error during Initial Setup of Partitions

Hi,

Below is the error we get when we try to run the AsPartitionProcessing solution.

We have already set up all the required parameter tables, and the application is connecting to the source (OLAP) and target (SSAS) servers.

The solution identifies the tables, but when it tries to create the partitions, the error below is raised.

Sequentially process 2017-07 /DataOnly

Exception message: Failed to save modifications to the server. Error returned: 'Syntax error in partition '201707' in table 'Stock_Lifting'. Token Semicolon expected. Start position: (1, 83). End position (1, 84).

Technical Details:
RootActivityId: 489f8910-e719-4a5f-b658-ede192464679
Date (UTC): 2/6/2019 11:52:37 AM
'.

Please guide us; it would be very helpful.

Thanks In Advance

Regards
Mohan

I can't open AsPartitionProcessing.AdventureWorks.smproj

I tried the SQL Server 2017 + SSDT for VS 2015 and SQL Server 2017 + SSDT for VS 2017 combinations.

The error message:
An error occurred while opening the model on the workspace database. Reason: Input string was not in a correct format.

Call Stack:

at Microsoft.AnalysisServices.VSHost.VSHostManager.PrepareSandbox(Boolean newProject, Boolean& isRefreshNeeded, Boolean& isImpersonationChanged, Boolean& saveRequired, List`1& truncatedTables, Boolean isRealTimeMode, Int32 clientCompatibilityLevel)
at Microsoft.AnalysisServices.VSHost.Integration.EditorFactory.CreateEditorInstance(UInt32 grfCreateDoc, String pszMkDocument, String pszPhysicalView, IVsHierarchy pvHier, UInt32 itemid, IntPtr punkDocDataExisting, IntPtr& ppunkDocView, IntPtr& ppunkDocData, String& pbstrEditorCaption, Guid& pguidCmdUI, Int32& pgrfCDW)

BismNormalizer CLI - Unable to locate EnvDTE assembly

I am attempting to use the BismNormalizer CLI as part of an Azure DevOps build pipeline on a Microsoft-hosted VS2017 agent. The extension installs without error using vsixinstaller.exe within the build pipeline. During execution of BismNormalizer.exe, however, it errors saying it cannot find the file or assembly EnvDTE. I have confirmed that EnvDTE is present on the build server.

Unsure if I'm on the right path here... but the best information I've found indicates that a few years back there were breaking changes for Visual Studio extensions revolving around public/global assemblies. https://docs.microsoft.com/en-us/visualstudio/extensibility/breaking-changes-2017?view=vs-2019

Using BISM Normalizer 4.0.1.1

`C:\Users\VssAdministrator\AppData\Local\Microsoft\VisualStudio\15.0_e2c13fd7\Extensions\jrxtqxjp.4qq\BismNormalizer.exe
About to deserialize D:\a\2\s\Semantic Layer\POC\Unit Test Framework\TabularCompare.bsmn

Source Project File: $/Semantic Layer/Optimized Models/Member/Model.smproj
Target Project: Composite

--Comparing ...
The following exception occurred:
System.IO.FileNotFoundException: Could not load file or assembly 'EnvDTE, Version=8.0.0.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a' or one of its dependencies. The system cannot find the file specified.
File name: 'EnvDTE, Version=8.0.0.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a'
at BismNormalizer.TabularCompare.ConnectionInfo.InitializeCompatibilityLevel(Boolean closedBimFile)
at BismNormalizer.TabularCompare.ComparisonInfo.InitializeCompatibilityLevels()
at BismNormalizer.CommandLine.Program.Main(String[] args)

WRN: Assembly binding logging is turned OFF.
To enable assembly bind failure logging, set the registry value [HKLM\Software\Microsoft\Fusion!EnableLog] (DWORD) to 1.
Note: There is some performance penalty associated with assembly bind failure logging.
To turn this feature off, remove the registry value [HKLM\Software\Microsoft\Fusion!EnableLog].`

Creating Objects from Source to Target results in "unknown" Data Type

Hi,

here is what I did:

  • use v5.0.3 on Win 10
  • loaded .pbit as source
  • loaded .bim as target
  • created objects from source to target

After creating all the objects, the tool still indicates a lot of "update" actions. Checking in detail, I see that dataType in the target is not set as in the source but to "unknown" instead. A second run of "Validation" and "Update" does not solve it.


The type or namespace name 'MPartitionSource' could not be found (are you missing a using directive or an assembly reference?)

Hi

I'm trying to get rid of this error with no success. I already tried a new VM with VS 2017 and SSDT 2017 installed from scratch, but no success.

I also tried installing AMO from the SQL Server 2016 Feature Pack on another machine (since I can't install it on a machine with SSDT 2017) and copying the Microsoft.AnalysisServices.DLL installed under the C:\Program Files (x86)\Microsoft SQL Server\130\SDK path, but no success either.

Could you please give me some guidance on how to get this namespace working properly on my project?

Thank you very much.
Renan Ribeiro

Unclear on which *direction* actions like Update/Delete etc go?

Hi,

I've been using that other dude's Tabular Editor to make improvements to a deployed tabular model. So now the Visual Studio .bim is behind the good server-deployed model.

I'm hoping to use Normalizer to catch up the Visual Studio .bim, but when I go through the list of differently defined objects, I'm not sure what the "update" action means.

What I want is for the project .bim objects to be updated with the definitions that are on the server. How can I do this, and convince myself that's actually the direction the update is going?

I don't want to lose my (deployed) work by "updating" the good server version with the obsolete VS project one!

thanks!

Does Azure Analysis Services support automatically retrieving metadata of a Tabular model?

Hi Microsoft geeks,

I'm writing to ask whether it is possible to extract metadata (e.g., table names, column names, KPIs, measures) from an Azure Analysis Services data model. I assume this information is stored in a JSON format for a tabular data model.

I've been digging into the related Azure documentation and tried making REST API calls this week, but couldn't get the metadata I need and am not clear on which method/operation to use. Can you please clarify which of the APIs/scripting languages documented below support this use case (reading metadata), or how to use them in combination to solve this problem?

Asynchronous refresh with the REST API
Azure Analysis Services REST API
Tabular Model Scripting Language
Microsoft.AnalysisServices.Tabular Namespace
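For reference, a minimal sketch of reading this kind of metadata with the Tabular Object Model (the Microsoft.AnalysisServices.Tabular namespace listed above). The server URI and database name are placeholders, and it assumes the AMO/TOM client library is installed:

```csharp
using System;
using Microsoft.AnalysisServices.Tabular;

class ReadModelMetadata
{
    static void Main()
    {
        using (var server = new Server())
        {
            // Placeholder Azure AS server URI; authenticate as you normally would.
            server.Connect("Data Source=asazure://<region>.asazure.windows.net/<servername>");

            Database db = server.Databases.GetByName("<database>");
            foreach (Table table in db.Model.Tables)
            {
                Console.WriteLine($"Table: {table.Name}");
                foreach (Column column in table.Columns)
                    Console.WriteLine($"  Column: {column.Name} ({column.DataType})");
                foreach (Measure measure in table.Measures)
                    Console.WriteLine($"  Measure: {measure.Name}");
            }
        }
    }
}
```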

Thank you,

Best,
Subi

[email protected]
[email protected]

Connection to Azure Analysis Services fails if multi-factor authentication is enabled on the account

I have created an Azure Analysis Services instance with access given to an account from Azure Active Directory. I can connect successfully from a domain-joined machine using either "Active Directory - with Password" or "Active Directory - Universal with MFA Support". However, if I set the configuration to integrated auth, I get the error below.

Exception occurred: 08:21:47 PM Exception message: Exception has been thrown by the target of an invocation. Inner exception message: password_required_for_managed_user: Password is required for managed user

When I disable integrated auth and provide a username and password through the command prompt, I get this error instead:
Exception occurred: 07:55:40 PM Exception message: Exception has been thrown by the target of an invocation. Inner exception message: AADSTS50076: Due to a configuration change made by your administrator, or because you moved to a new location, you must use multi-factor authentication to access '4ac7d521-0382-477b-b0f8-7e1d95f85ca2'. Trace ID: 7a724497-67e9-492f-b60c-0a5945c63000 Correlation ID: 639928ef-c8d8-4a63-8ce0-15e8b26e65af Timestamp: 2018-09-21 19:55:40Z
Is it possible to provide support for multi-factor accounts, or should we set up an account without one?

Monthly Partition Overlap

Hi there

When I implement this with monthly partitions, I am getting an overlap on the months with the current configuration.


Any idea how to get this resolved? As it stands, I would need to make the change manually and reprocess the entire cube.

Feature request: Ignore white space option

A great feature would be to ignore formatting when comparing code. If someone updates only the formatting of the code, the whole section gets highlighted as a change.

I would like to ignore the formatting and be able to see only real changes to the code.

Missing assembly

Hi,

After updating the client libraries (AMO, ADOMD, OLE DB) from 14 to 15, AsPartitionProcessing.SampleClient stopped working with the error:

Could not load file or assembly 'Microsoft.AnalysisServices.Tabular, Version=14.0.0.0, Culture=neutral, PublicKeyToken=89845dcd8080cc91' or one of its dependencies. The located assembly's manifest definition does not match the assembly reference. (Exception from HRESULT: 0x80131040)

The strange thing is that when I updated the drivers on the dev server, the drivers are available in versions 13, 14 and 15, but when I do the same on the test server, I have only versions 13 and 15. I tried changing the version in the AsPartitionProcessing.SampleClient.exe.config file from 14 to 15, but that didn't work.

Any workaround? Or a permanent fix for using the latest drivers?

BISM Normalizer with Compatibility level 1103 not working

I'm trying to compare two 1103 models and receive an error "object reference not set to an instance of an object"

Upgrading to 1200 and comparing works great, but I need to keep the compatibility level at 1103. I have tried comparing project to project, project to database on SQL Server 2012 SP2, and two databases on the server, and I receive the same error.

I'm using the latest version of BISM normalizer (5.0.0.5) and tested on both VS2017 and VS2019.

More detailed instructions on how to get SampleClient to work with an Azure Function

Hi,

First of all, I just want to say that this solution is awesome! I have done every example in the whitepaper and got everything to work locally. Now it is time to get the SampleClient to work in an Azure Function, but I find there is not much detail about how to go about that. Many of us don't have a coding background, and it would help a lot if there were a step-by-step guide to getting the SampleClient up and running in an Azure Function. I don't even know where to start. Can I just copy and paste the code from SampleClient into an Azure Function?

I have followed the other blogs (https://azure.microsoft.com/en-us/blog/azure-as-automated-partition-management/?v=17.23h and https://sqldusty.com/2017/06/21/how-to-automate-processing-of-azure-analysis-services-models/) that have instructions for processing Azure Analysis Services through an Azure Function and got that to work. But I want to get the SampleClient solution to work through an Azure Function.

If anyone can point me in the right direction it would be very helpful.
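As a very rough starting point, here is a minimal sketch of a timer-triggered Azure Function (in-process C#) that connects to Azure Analysis Services with the TOM client library and requests a full refresh. It does not reproduce the SampleClient's partition-management logic; the schedule, app setting name, and database name are placeholders:

```csharp
using System;
using Microsoft.AnalysisServices.Tabular;
using Microsoft.Azure.WebJobs;
using Microsoft.Extensions.Logging;

public static class ProcessTabularModel
{
    // Illustrative schedule: runs every day at 02:00 UTC.
    [FunctionName("ProcessTabularModel")]
    public static void Run([TimerTrigger("0 0 2 * * *")] TimerInfo timer, ILogger log)
    {
        // "AsConnectionString" is a placeholder app setting holding the asazure:// connection string.
        string connectionString = Environment.GetEnvironmentVariable("AsConnectionString");

        using (var server = new Server())
        {
            server.Connect(connectionString);
            Database database = server.Databases.GetByName("AdventureWorks"); // placeholder database name
            database.Model.RequestRefresh(RefreshType.Full);
            database.Model.SaveChanges();
        }

        log.LogInformation("Processing completed at {time}", DateTime.UtcNow);
    }
}
```

To host the SampleClient's actual partition-management code instead, its console logging and config-file reading would presumably need Function-friendly equivalents.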

Execute AsPartitionProcessing from SSIS script task

The documentation states that the sample "Can be leveraged in many ways including from an SSIS script task". Is there sample code where you can run the AsPartitionProcessing process using an SSIS script task?

The only way I was able to run the AsPartitionProcessing process in SSIS is by invoking the "AsPartitionProcessing.SampleClient.exe" file. However, running an exe in SSIS has limitations, such as error handling: the tabular processing might have errors (which are logged in the database), but the SSIS task thinks it ran successfully.

It would be nice to have a sample code where you can run the AsPartitionProcessing process using an SSIS script task.
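Not an official sample, but a rough sketch of the shape such a script task could take. The ProcessPartitions method is a hypothetical placeholder for whatever entry point of the AsPartitionProcessing library you reference from the script project; the point is to surface failures to SSIS through Dts.Events and Dts.TaskResult so the package fails when processing fails:

```csharp
using System;
using Microsoft.SqlServer.Dts.Runtime;

public partial class ScriptMain : Microsoft.SqlServer.Dts.Tasks.ScriptTask.VSTARTScriptObjectModelBase
{
    public void Main()
    {
        try
        {
            // Hypothetical helper wrapping the AsPartitionProcessing code referenced by the script project.
            ProcessPartitions();

            Dts.TaskResult = (int)DTSExecResult.Success;
        }
        catch (Exception ex)
        {
            // Fail the task so downstream SSIS error handling and logging fire.
            Dts.Events.FireError(0, "AsPartitionProcessing", ex.ToString(), string.Empty, 0);
            Dts.TaskResult = (int)DTSExecResult.Failure;
        }
    }

    private void ProcessPartitions()
    {
        // Placeholder: call the library's processing entry point here and throw if it reports errors.
        throw new NotImplementedException();
    }
}
```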

BISM Normalizer - Log of Changes?

Hi - does BISM Normalizer have a logging feature where all the changes applied during a target update operation are saved?

Something like the "Report Differences" feature, which generates an Excel file summary of the comparison. This is useful, but it has to be saved manually and does not necessarily represent the final changes that were applied to the target.

Thanks for any help.

Invoke-ProcessASDatabase: database compatibility level of 1500 is below the minimal compatibility level of 2147483647

I am using Invoke-ProcessASDatabase within a runbook to automate refresh of an Azure tabular model and receive this error, relating to the calculation groups in the model requiring a higher compatibility level.

"...database compatibility level of 1500 is below the minimal compatability level of 2147483647..."

The model is compatibility level 1500, and I can process the model in full quite happily through SSMS. Presuming 2147483647 isn't a real compatibility level! Is it some default upper limit being thrown back by the cmdlet? Is this fixable with any switches, or do I need to look for an alternative approach?

Ta

No scrollbars in object definition windows

The source/target object definition windows lost their scrollbars with latest update, in Visual Studio/SSDT 2017. Vertical scrolling still works with the mouse wheel, but no scrollbars are visible. The target definition window will scroll horizontally in response to keyboard navigation (arrow keys, home/end), but the source definition window will not scroll horizontally at all.

Mixed Granularity Question

I have some difficulty grasping the Rolling Window setup in combination with a Mixed Granularity.

What I want:

  • Data from years previous to 2018 (<= 2017) in a Yearly granularity
  • Data from the months prior to the current month (< currentmonth) in a Monthly granularity
  • Data from the current month in a daily granularity

I want only the last 7 days to be incrementally processed daily. All other partitions will be refreshed weekly.

My current setup is like this:
Model: (screenshot)
PartitionConfiguration: (screenshot)

This setup returns an error because the dates overlap, which is 'correct'. This is due to the fact that the number of partitions isn't dynamically calculated.
I tried another setup limiting the number of full partitions (21), which gave me the right configuration yesterday (2018-08-21), but today (2018-08-22), when I ran the script again this morning, the partition for the first day of this month (20180801) was removed because it fell out of scope (it was 'the 22nd partition').

Am I trying to do something that isn't supported out of the box? Should I try another approach with the configuration, or look at it in a completely different way?

I expected the partition for the first day of the month to remain and a new partition for this day to be created.

Could you help me understand more about the configuration and tell me if it's possible to achieve what I want? Thank you in advance.

RestApiSample and PostAsJsonAsync

FYI... I needed to add the Microsoft.AspNet.WebApi.Client package in order for System.Net.Http.Formatting and PostAsJsonAsync to resolve.
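For context, a minimal sketch of the call that depends on that package; the refresh URL and request body are illustrative placeholders rather than the RestApiSample's exact code:

```csharp
using System.Net.Http;
using System.Threading.Tasks;

static class RefreshExample
{
    // PostAsJsonAsync is an extension method from System.Net.Http.Formatting,
    // which ships in the Microsoft.AspNet.WebApi.Client NuGet package.
    public static async Task PostRefreshAsync(HttpClient client, string refreshUrl)
    {
        var body = new { Type = "Full", MaxParallelism = 2 }; // illustrative payload
        HttpResponseMessage response = await client.PostAsJsonAsync(refreshUrl, body);
        response.EnsureSuccessStatusCode();
    }
}
```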

Permission Violation during execution of code

Hi Christian,

While executing the code, after providing/updating all the config details and the partition query in the config file and Program.cs, control is lost partway through and the processing goes to another cube database whose information is not provided anywhere in the code. We found this with Profiler, and we get the following error: [Failed to save modifications to the server. Error returned: 'OLE DB or ODBC error: Syntax error, permission violation, or other nonspecific error; 42000]. This happens when multiple cube databases are on the server.
Please help in solving this issue.

Extend to provide Process Add frequency within a partition

I'd like it to support Process Add and mixed partitioning, e.g. have monthly partitions but, in the latest month, have a Process Add frequency so I can process-add days until the next month rolls over. I am happy to extend it... how can I contribute?

ALM Toolkit - Report Differences error opening Excel 64-bit

In ALM Toolkit version 5.0.1, when I click the Report Differences button, I receive:

Unable to cast COM object of type 'Microsoft.Office.Interop.Excel.ApplicationClass' to interface type 'Microsoft.Office.Interop.Excel._Application'. This operation failed because the QueryInterface call on the COM component for the interface with IID '{000208D5-0000-0000-C000-000000000046}' failed due to the following error: Error loading type library/DLL. (Exception from HRESULT: 0x80029C4A (TYPE_E_CANTLOADLIBRARY)).

I am using Excel Version 1902 (Build 11328.20480 Click-to-Run) 64-bit

Failed to save modifications to the server

Hello Christian,

I've been testing ALM Toolkit and it seems to be corrupting the Power BI file.

  1. Created a model in Power BI. Used parameters to select a date range. Defined an incremental refresh policy.
  2. Published to the Service, in a Premium workspace.
  3. Connected to the Premium workspace and compared the dataset with the Power BI file (the "Power BI Desktop" option), as per the picture below.

(screenshot: ALM connection)

  4. Added a measure in Desktop and deployed the new components to the dataset in the Power BI Service.

Then I carried on with my testing and wanted to add a new table (in the Power BI file), but after I clicked "Close and Apply" in Power Query, I got the following error:

(screenshot: load error)

I've seen some people coming across the same issue:
https://community.powerbi.com/t5/Desktop/Failed-to-save-modifications-to-the-server-when-calling-API/td-p/990136

The issue is resolved if I remove the incremental refresh policy in Desktop.

Is this a known issue?

Btw, thank you so much for doing this wonderful tool, it is going to be a game changer for enterprise deployments. :)

Felipe,

No room is available to display rows

Hi there, I just encountered this strange error while comparing target with source in a tabular model using BISM Normalizer. Does anyone know why this pops up?

Solution File Corrupt

I am not able to extract the solution file "AsPartitionProcessing.sln"; it gives a "solution file not valid" error when I try to open the solution via Visual Studio.

Can you please help with a new solution file?

Deploy AW sample tabular model fails...

I'm running SQL Server 2017 Standard and VS 2017 Enterprise.

I restored the included AWDW .bak file without incident. The solution builds without incident.

Going into the .bim within VS, going to Table Properties on a table, and clicking Design results in the following connectivity error:

DataSource.Error: Microsoft SQL: A network-related or instance-specific error occurred while establishing a connection to SQL Server. The server was not found or was not accessible. Verify that the instance name is correct and that SQL Server is configured to allow remote connections. (provider: SQL Network Interfaces, error: 26 - Error Locating Server/Instance Specified)
Details:
DataSourceKind=SQL
DataSourcePath=localhost\sp1;AdventureWorksDW
Message=A network-related or instance-specific error occurred while establishing a connection to SQL Server. The server was not found or was not accessible. Verify that the instance name is correct and that SQL Server is configured to allow remote connections. (provider: SQL Network Interfaces, error: 26 - Error Locating Server/Instance Specified)
Number=-1
Class=20

The error makes sense because my server is localhost, not localhost\sp1.

How do I change the DataSourcePath to correctly reflect my server?


Deploying the tabular project also fails with the "error 26 locating server/instance specified" message.

Additional VS warnings spring up:
Severity Code Description Project File Line Suppression State
Warning Found conflicts between different versions of "Microsoft.AnalysisServices.Core" that could not be resolved. These reference conflicts are listed in the build log when log verbosity is set to detailed. AsPartitionProcessing.SampleClient

Severity Code Description Project File Line Suppression State
Warning Found conflicts between different versions of "Microsoft.AnalysisServices.Tabular" that could not be resolved. These reference conflicts are listed in the build log when log verbosity is set to detailed. AsPartitionProcessing.SampleClient

So how do I get from here to success with the sample?

thanks for any tips,
sff

Cannot deploy using BISM Normalizer to AAS

Hi there Christian,

I thought I would let you know that I have been trying to use BISM Normalizer to deploy an update to an AAS database.
I ran a Profiler trace and got the following error message, which I thought would be good for you to see.

(screenshot of the error message)

BISM Normalizer Compatibility Issue

Hi All,

I have two models with a compatibility level of 1465. I cannot merge changes from one model to the other due to the Normalizer's version.

The specific error that returns is:

(screenshot: BISM Normalizer error)

There is no purchase page on the BISM Normalizer website.

Any ideas on how I could get this compatibility level to work on the version that is out there... which is 4.0.0.30?

Thanks

Eddie

Level of 1500 is below the minimal compatibility level of 2147483647

I've recently installed SSAS 2019 and deployed a few tabular models using the 1500 compatibility level.

When I try to process the db from SSMS I get the following error message:

"The database compatibility level of 1500 is below the minimal compatability level of 2147483647 needed..."

My SSAS version is 15.0.32.52.
