chillicream / graphql-platform

Welcome to the home of the Hot Chocolate GraphQL server for .NET, the Strawberry Shake GraphQL client for .NET and Banana Cake Pop the awesome Monaco based GraphQL IDE.

Home Page: https://chillicream.com

License: MIT License


graphql-platform's Introduction

ChilliCream GraphQL Platform


Welcome to the ChilliCream GraphQL Platform!

We help developers and companies take their APIs to the next level with GraphQL: strongly typed schemas that match your APIs 100 percent, efficient data fetching that reduces overall cost without extra effort, and consumer-friendly, declarative, self-documenting APIs that support you in your daily work of building powerful UIs.

Most of our products are open-source and right here in this repository.

Everyone is welcome! Always remember to treat everyone respectfully, no matter their gender, opinion, religion, or skin tone. We're one world, and together we're stronger!

Join our awesome community on Slack if you would like to get in touch with us, need help, or just want to learn!

Our Products

Hot Chocolate

Hot Chocolate is the most efficient, feature-rich, open-source GraphQL server in the .NET ecosystem. It helps developers build powerful GraphQL APIs and gateways with ease.

Documentation

Banana Cake Pop

Banana Cake Pop is an awesome, performant, feature-rich GraphQL IDE that helps developers and data scientists explore, share, and test any GraphQL API.

Banana Cake Pop can be installed as a desktop app, used as a web app (which can also be installed through your browser of choice), or run as middleware on your GraphQL endpoint. Middleware is available for .NET and Node.js, with more to follow.

Documentation

Strawberry Shake

Strawberry Shake is an incredible, open-source GraphQL client for the .NET ecosystem that helps developers build awesome UIs in Blazor, MAUI, and more. Unlike most .NET GraphQL clients, Strawberry Shake is a type-safe client that generates .NET types from your GraphQL schema out of the box. Moreover, it comes with a reactive store like the ones in Relay and Apollo Client, so you can build reactive UIs in .NET with features like client-side data caching and data-fetching strategies.

Documentation

Green Donut

Green Donut is a lightweight, yet powerful DataLoader that simplifies batching, caching, and solves the N+1 problem.
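The batching idea behind a DataLoader is language-agnostic. Below is a minimal sketch in Python of how collecting keys and resolving them in one batch call avoids the N+1 problem; the class and method names are illustrative, not Green Donut's actual API.

```python
# Minimal DataLoader sketch: keys requested during one execution step are
# collected and resolved with a single batch call instead of one call per key.
class DataLoader:
    def __init__(self, batch_fn):
        self.batch_fn = batch_fn  # resolves many keys in one round trip
        self.cache = {}           # per-loader cache: key -> value
        self.queue = []           # keys waiting for the next dispatch

    def load(self, key):
        # Deduplicate: cached or already-queued keys are not fetched again.
        if key not in self.cache and key not in self.queue:
            self.queue.append(key)

    def dispatch(self):
        if self.queue:
            for key, value in zip(self.queue, self.batch_fn(self.queue)):
                self.cache[key] = value
        self.queue = []

    def result(self, key):
        return self.cache[key]

calls = []
def fetch_users(ids):
    calls.append(list(ids))  # one "database round trip" per batch
    return [f"user-{i}" for i in ids]

loader = DataLoader(fetch_users)
for user_id in [1, 2, 3, 2]:  # four field resolutions, one of them duplicated
    loader.load(user_id)
loader.dispatch()
print(calls)             # [[1, 2, 3]] -- a single batched call, no N+1
print(loader.result(2))  # user-2
```

A real implementation returns promises/tasks from `load` and dispatches automatically at batch boundaries, but the caching and batching mechanics are the same.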

Documentation

Roadmap

If you are interested in upcoming releases, check out our Roadmap.

Official examples

Examples of things built on top of the ChilliCream GraphQL Platform that are open source and can be explored by others.

Contributing

Become a code contributor and help us make the ChilliCream GraphQL platform even better!

From our community

Check out what members of our awesome community have made!

Financial Contributors

Become a financial contributor and help us sustain our community.

Sponsor

Become a sponsor and get your logo on our README on GitHub with a link to your site.

Backer

Become a backer and get your image on our README on GitHub with a link to your site.

graphql-platform's People

Contributors

a360jmaxxgamer, antonc9018, arif-hanif, artola, benmccallum, cajuncoding, damikun, david-driscoll, dependabot-preview[bot], dolfik1, drowhunter, fredericbirke, glen-84, glucaci, gmiserez, grounzero, huysentruitw, jorrit, matt-psaltis, michaelstaib, pascalsenn, promontis, rstaib, sergeyshaykhullin, sunghwan2789, tobias-tengler, tsinghammer, vhatsura, willwolfram18, wonbyte


graphql-platform's Issues

Cache Compiled Resolvers

Schema creation is slow. This is because we compile the resolvers with .NET at startup so that resolvers are fast while executing queries.

To avoid compiling new resolvers on every start, we should write the resolver assembly to disk and reload it on the next start, provided it was not modified and we can ensure that the compiled resolvers are the resolvers we want.
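The general mechanism can be sketched as follows (Python, with hypothetical names; the real implementation would deal with .NET assemblies): key the cached artifact by a hash of the inputs compilation depends on, so a changed schema invalidates the cache automatically.

```python
import hashlib
import os
import tempfile

def cache_key(source: str) -> str:
    # Key the cached artifact by a hash of everything compilation depends on.
    return hashlib.sha256(source.encode()).hexdigest()

def get_or_compile(source: str, compile_fn, cache_dir: str) -> bytes:
    path = os.path.join(cache_dir, cache_key(source) + ".bin")
    if os.path.exists(path):       # unchanged input -> reuse the artifact
        with open(path, "rb") as f:
            return f.read()
    artifact = compile_fn(source)  # the expensive step (resolver compilation)
    with open(path, "wb") as f:
        f.write(artifact)
    return artifact

compilations = []
def slow_compile(source: str) -> bytes:
    compilations.append(source)    # track how often we actually compile
    return source.upper().encode()

with tempfile.TemporaryDirectory() as d:
    first = get_or_compile("resolver code", slow_compile, d)
    second = get_or_compile("resolver code", slow_compile, d)

print(len(compilations))  # 1 -- the second call hit the disk cache
```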

Refactor FieldResolverKind.cs

I've selected FieldResolverKind.cs for refactoring, which is a module that has 1 instance(s) of code in comments (1). Addressing this will make our codebase more maintainable and improve Better Code Hub's Write Clean Code guideline rating! πŸ‘

Here's the gist of this guideline:

  • Definition πŸ“–
    Apply The Boy Scout Rule and fix "code smells" in the codebase.
  • Why❓
    Clean code is maintainable code.
  • How πŸ”§
    Remove useless comments, commented code blocks, and dead code. Refactor poorly handled exceptions, magic constants, and poorly named units or variables.

You can find more info about this guideline in Building Maintainable Software. πŸ“–


ℹ️ To know how many other refactoring candidates need addressing to get a guideline compliant, select some by clicking on the πŸ”² next to them. The risk profile below the candidates signals (βœ…) when it's enough! 🏁


Good luck and happy coding! :shipit: ✨ πŸ’―

EnumTypeFactory

The EnumTypeFactory has to be implemented in order to support enums for the schema-first APIs.

Refactor MemberResolver.cs

I've selected MemberResolver.cs for refactoring, which is a module that has 23 instance(s) of code in comments (1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23). Addressing this will make our codebase more maintainable and improve Better Code Hub's Write Clean Code guideline rating! πŸ‘


Refactor __Type.Configure(IObjectTypeDescriptor)

I've selected __Type.Configure(IObjectTypeDescriptor) for refactoring, which is a unit of 134 lines of code. Addressing this will make our codebase more maintainable and improve Better Code Hub's Write Short Units of Code guideline rating! πŸ‘

Here's the gist of this guideline:

  • Definition πŸ“–
    Limit the length of code units to 15 lines of code.
  • Why❓
    Small units are easier to analyse, test and reuse.
  • How πŸ”§
    When writing new units, don't let them grow above 15 lines of code. When a unit grows beyond this, split it in smaller units of no longer than 15 lines.

You can find more info about this guideline in Building Maintainable Software. πŸ“–


ℹ️ To know how many other refactoring candidates need addressing to get a guideline compliant, select some by clicking on the πŸ”² next to them. The risk profile below the candidates signals (βœ…) when it's enough! 🏁


Good luck and happy coding! :shipit: ✨ πŸ’―

Refactor CharExtensions.CharExtensions()

I've selected CharExtensions.CharExtensions() for refactoring, which is a unit of 59 lines of code. Addressing this will make our codebase more maintainable and improve Better Code Hub's Write Short Units of Code guideline rating! πŸ‘


Refactor OperationExecuter.TryCompleteObjectValue(ExecutionContext, IResolverContext, ImmutableStack, FieldSelection, IType, Path, object, Action)

I've selected OperationExecuter.TryCompleteObjectValue(ExecutionContext, IResolverContext, ImmutableStack, FieldSelection, IType, Path, object, Action) for refactoring, which is a unit of 17 lines of code and 8 parameters. Addressing this will make our codebase more maintainable and improve Better Code Hub's Keep Unit Interfaces Small guideline rating! πŸ‘

Here's the gist of this guideline:

  • Definition πŸ“–
    Limit the number of parameters per unit to at most 4.
  • Why❓
    Keeping the number of parameters low makes units easier to understand, test and reuse.
  • How πŸ”§
    Reduce the number of parameters by grouping related parameters into objects. Alternatively, try extracting parts of units that require fewer parameters.

You can find more info about this guideline in Building Maintainable Software. πŸ“–


ℹ️ To know how many other refactoring candidates need addressing to get a guideline compliant, select some by clicking on the πŸ”² next to them. The risk profile below the candidates signals (βœ…) when it's enough! 🏁


Good luck and happy coding! :shipit: ✨ πŸ’―

Refactor ExecutionContext.ExecutionContext(Schema, DocumentNode, OperationDefinitionNode, VariableCollection, IServiceProvider, object, object)

I've selected ExecutionContext.ExecutionContext(Schema, DocumentNode, OperationDefinitionNode, VariableCollection, IServiceProvider, object, object) for refactoring, which is a unit of 34 lines of code and 7 parameters. Addressing this will make our codebase more maintainable and improve Better Code Hub's Keep Unit Interfaces Small guideline rating! πŸ‘


Refactor SyntaxNodeVisitor.ExecuteVisitationMap(ISyntaxNode)

I've selected SyntaxNodeVisitor.ExecuteVisitationMap(ISyntaxNode) for refactoring, which is a unit of 131 lines of code. Addressing this will make our codebase more maintainable and improve Better Code Hub's Write Short Units of Code guideline rating! πŸ‘


Write more tests for the following types

  • /src/Core/Internal/BaseTypes.cs
  • /src/Core/Resolvers/FieldResolverDescriptor.cs
  • /src/Core/Internal/ServiceManager.cs
  • /src/Core/Resolvers/FieldResolver.cs
  • /src/Language/Lexer/Source.cs

Refactor VariableValueResolver.CoerceVariableValues(Schema, OperationDefinitionNode, IReadOnlyDictionary)

I've selected VariableValueResolver.CoerceVariableValues(Schema, OperationDefinitionNode, IReadOnlyDictionary) for refactoring, which is a unit of 54 lines of code. Addressing this will make our codebase more maintainable and improve Better Code Hub's Write Short Units of Code guideline rating! πŸ‘


Refactor TypeFactoryTests.cs

I've selected TypeFactoryTests.cs for refactoring, which is a module that has 1 instance(s) of code in comments (1). Addressing this will make our codebase more maintainable and improve Better Code Hub's Write Clean Code guideline rating! πŸ‘


Review and fix exceptions and their messages

We have to review all exception types and their messages; many of them have typos and other issues. We might also want to introduce different exception types in many cases.

Throttling

This feature shall provide more security options to the execution engine.

The solutions we've seen so far are great for stopping abusive queries from taking your servers down. The problem with using them alone is that they will stop large queries but won't stop clients that make a lot of medium-sized queries!

In most APIs, a simple throttle is used to stop clients from requesting resources too often. GraphQL is a bit special because throttling on the number of requests does not really help us. Even a few queries might be too much if they are very large.

In fact, we have no idea what number of requests is acceptable, since queries are defined by the clients. So what can we use to throttle clients?

Throttling Based on Server Time

A good estimate of how expensive a query is, is the server time it needs to complete. We can use this heuristic to throttle queries. With good knowledge of your system, you can come up with a maximum amount of server time a client can use over a certain time frame.

We also decide how much server time is added back to a client over time. This is a classic leaky bucket algorithm. Note that there are other throttling algorithms out there, but they are out of scope for this section. We will use a leaky bucket throttle in the next examples.

Let's imagine our maximum allowed server time (bucket size) is set to 1000ms, that clients gain 100ms of server time per second (leak rate), and that this mutation:

mutation {
  createPost(input: { title: "GraphQL Security" }) {
    post {
      title
    }
  }
}

takes on average 200ms to complete. In reality the time may vary, but we'll assume it always takes 200ms to complete for the sake of this example.

It means that a client calling this operation more than 5 times within 1 second would be blocked until more available server time is added to the client.

After two seconds (100ms is added per second), our client could call createPost a single time.
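A minimal sketch of this leaky bucket, using the numbers from the example above (Python, illustrative only, not Hot Chocolate's implementation):

```python
class LeakyBucketThrottle:
    """Leaky bucket: queries spend server time; used time drains at leak_rate."""

    def __init__(self, capacity_ms=1000, leak_rate_ms_per_s=100):
        self.capacity = capacity_ms
        self.leak_rate = leak_rate_ms_per_s
        self.used = 0.0   # server time currently "in the bucket"
        self.last = 0.0   # timestamp of the last update, in seconds

    def allow(self, cost_ms, now_s):
        # Leak: drain used time according to how much time has passed.
        self.used = max(0.0, self.used - (now_s - self.last) * self.leak_rate)
        self.last = now_s
        if self.used + cost_ms > self.capacity:
            return False  # client must wait for the bucket to drain
        self.used += cost_ms
        return True

throttle = LeakyBucketThrottle()
# Six 200ms mutations at t=0: the first five fill the 1000ms bucket.
results = [throttle.allow(200, now_s=0.0) for _ in range(6)]
print(results)  # [True, True, True, True, True, False]
# After two seconds, 200ms has leaked out, allowing exactly one more call.
print(throttle.allow(200, now_s=2.0))  # True
print(throttle.allow(200, now_s=2.0))  # False
```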

As you can see, throttling based on time is a great way to throttle GraphQL queries: complex queries end up consuming more time, meaning you can call them less often, while smaller queries can be called more often since they are fast to compute.

It can be good to express these throttling constraints to clients if your GraphQL API is public. In that case, server time is not always the easiest thing to express to clients, and clients cannot really estimate what time their queries will take without trying them first.

Remember the Max Complexity we talked about earlier? What if we throttled based on that instead?

Throttling Based on Query Complexity

Throttling based on Query Complexity is a great way to work with clients and help them respect the limits of your schema.

Let’s use the same complexity example we used in the Query Complexity section:

query {
  author(id: "abc") {    # complexity: 1
    posts {              # complexity: 1
      title              # complexity: 1
    }
  }
}

We know that this query has a cost of 3 based on complexity. Just like with a time throttle, we can come up with a maximum cost (bucket size) per time frame that a client can use.

With a maximum cost of 9, our clients could run this query only three times before they have to wait for the bucket to drain.

The principles are the same as our time throttle, but now communicating these limits to clients is much nicer. Clients can even calculate the costs of their queries themselves without needing to estimate server time!
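Sketched in Python (hypothetical names), the mechanics are the same as the time-based bucket; only the cost unit changes, and the remaining budget can be reported back to clients since they can compute it themselves:

```python
class ComplexityThrottle:
    """Same leaky bucket idea, but the cost unit is query complexity points."""

    def __init__(self, capacity=9, leak_rate_per_s=1):
        self.capacity = capacity
        self.leak_rate = leak_rate_per_s
        self.used = 0.0
        self.last = 0.0

    def check(self, cost, now_s):
        # Drain spent budget according to elapsed time, then try to spend.
        self.used = max(0.0, self.used - (now_s - self.last) * self.leak_rate)
        self.last = now_s
        allowed = self.used + cost <= self.capacity
        if allowed:
            self.used += cost
        # Returning the remaining budget lets the server expose it to clients,
        # the way GitHub's GraphQL API reports rate-limit cost and remaining.
        return allowed, self.capacity - self.used

throttle = ComplexityThrottle()
for _ in range(3):                        # the cost-3 query from above
    allowed, remaining = throttle.check(3, now_s=0.0)
print(allowed, remaining)                 # True 0.0 -- budget exhausted
print(throttle.check(3, now_s=0.0))       # (False, 0.0) -- fourth call blocked
```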

The GitHub public API actually uses this approach to throttle their clients. Take a look at how they express these limits to users: https://developer.github.com/v4/guides/resource-limitations/.

Summary

GraphQL is great to use for clients because it gives them so much more power. But that power also gives them the possibility to abuse your GraphQL server with very expensive queries.

There are many approaches to securing your GraphQL server against these queries, but none of them is bulletproof. It's important to know which options are available and to know their limits so that we make the best decisions!

https://www.howtographql.com/advanced/4-security/

Custom Schema Directives

Schema directives are directives that can be annotated on the type system, e.g. object types, fields, etc. All type descriptors should offer a new method for adding directives to them.

descriptor.Directive(new FooDirective(a, b, c));

// and

descriptor.Directive<FooDirective>();

Schema directives shall be able to tap into the execution engine and alter the execution behavior and path.

Ideally, I would want extension points before and after a resolver is executed.

Maybe we have something like the following:

public interface IResolverMiddleware
{
    void OnBeforeInvoke(IResolverContext context);

    object OnAfterInvoke(IResolverContext context, object resolverResult);
}
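To illustrate how such before/after hooks could wrap resolver invocation, here is a language-agnostic sketch in Python; the method names mirror the proposed interface, but everything else is hypothetical:

```python
# Sketch of the proposed extension point: middleware observes the context
# before invocation and may rewrite the result after invocation.
class UpperCaseMiddleware:
    def on_before_invoke(self, context):
        context.setdefault("log", []).append("before")

    def on_after_invoke(self, context, resolver_result):
        context["log"].append("after")
        return resolver_result.upper()  # a directive may alter the result

def execute_field(resolver, middlewares, context):
    for m in middlewares:
        m.on_before_invoke(context)
    result = resolver(context)
    for m in reversed(middlewares):     # unwind in reverse registration order
        result = m.on_after_invoke(context, result)
    return result

context = {}
result = execute_field(lambda ctx: "hello", [UpperCaseMiddleware()], context)
print(result)           # HELLO
print(context["log"])   # ['before', 'after']
```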

Also, the type system objects should provide access to their directives, so that even when a directive does not implement a resolver middleware, the resolver itself can tap into the directives.

context.Field.Directives.First();

Schema directives will be created on schema creation and are from that point on immutable.

Query directives on the other hand can be annotated differently in each query.

{
   foo @customDirective(myargs: 123)
}

URL Scalar

The URL scalar is not a type defined by the specification. Nevertheless, the specification mentions it as a "potentially useful custom scalar".

We agree with this and want to provide this type as one of our extended built-in types.

The URL type shall serialize as a string, and the Hot Chocolate GraphQL server guarantees that it is a valid URL. The native type representation should be System.Uri, and we should also provide a converter to and from a System.String representation.
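The parse/serialize contract can be sketched as follows (Python's urllib.parse stands in for System.Uri here; the validation rule is illustrative only):

```python
from urllib.parse import urlparse

def parse_url(value: str):
    """Coerce an incoming string literal into a native URL value, or fail."""
    parts = urlparse(value)
    if not parts.scheme or not parts.netloc:     # illustrative validity check
        raise ValueError(f"'{value}' is not a valid URL")
    return parts  # native representation (System.Uri in .NET)

def serialize_url(url) -> str:
    """URLs always leave the server as plain strings."""
    return url.geturl()

url = parse_url("https://chillicream.com/docs")
print(serialize_url(url))   # https://chillicream.com/docs
try:
    parse_url("not a url")
except ValueError:
    print("rejected")       # invalid literals never reach resolvers
```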

Query Complexity

This feature shall provide more security options to the execution engine.

Sometimes, the depth of a query is not enough to truly know how large or expensive a GraphQL query will be. In a lot of cases, certain fields in our schema are known to be more complex to compute than others.
Query complexity allows you to define how complex these fields are, and to restrict queries with a maximum complexity. The idea is to define how complex each field is by using a simple number. A common default is to give each field a complexity of 1. Take this query for example:

query {
  author(id: "abc") { # complexity: 1
    posts {           # complexity: 1
      title           # complexity: 1
    }
  }
}

A simple addition gives us a total of 3 for the complexity of this query. If we were to set a max complexity of 2 on our schema, this query would fail.

What if the posts field is actually much more complex than the author field? We can set a different complexity on that field. We can even set a different complexity depending on arguments! Let's take a look at a similar query, where posts has a variable complexity depending on its arguments:

query {
  author(id: "abc") {    # complexity: 1
    posts(first: 5) {    # complexity: 5
      title              # complexity: 1
    }
  }
}
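The calculation itself is a simple tree walk over the selections; a sketch in Python (the selection representation and cost table are made up for illustration):

```python
# Complexity is computed by walking the selection tree and summing each
# field's cost; a field may derive its cost from its arguments (e.g. `first`).
def complexity(selection, costs):
    name = selection["name"]
    args = selection.get("args", {})
    cost_fn = costs.get(name, lambda a: 1)   # default complexity of 1
    total = cost_fn(args)
    for child in selection.get("selections", []):
        total += complexity(child, costs)
    return total

# The query from above: author(id: "abc") { posts(first: 5) { title } }
query = {"name": "author", "args": {"id": "abc"}, "selections": [
    {"name": "posts", "args": {"first": 5}, "selections": [
        {"name": "title"},
    ]},
]}

# `posts` costs as many points as items requested via its `first` argument.
costs = {"posts": lambda args: args.get("first", 1)}
print(complexity(query, costs))   # 1 + 5 + 1 = 7
```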

Query Complexity Pros

  • Covers more cases than a simple query depth.
  • Reject queries before executing them by statically analyzing the complexity.

Query Complexity Cons

  • Hard to implement perfectly.
  • If complexity is estimated by developers, how do we keep it up to date? How do we find the costs in the first place?
  • Mutations are hard to estimate. What if they have a side effect that is hard to measure like queuing a background job?

https://www.howtographql.com/advanced/4-security/

Ignore Field

public class HumanType : ObjectType<Human>
{
    protected override void Configure(IObjectTypeDescriptor<Human> descriptor)
    {
        descriptor.Interface<CharacterType>();
        descriptor.Field(t => t.Friends)
            .Type<ListType<CharacterType>>()
            .Resolver(c => CharacterType.GetCharacter(c));
        descriptor.Field(t => t.Height).Ignore();
    }
}

Time Scalar

We already have an ISO-8601-based DateTimeType and DateType. To complete this set, a TimeType containing only the time component would be a nice addition.
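The intended behavior is a straightforward parse/serialize pair for ISO-8601 time strings; sketched in Python, where datetime.time stands in for the native .NET representation:

```python
from datetime import time

def parse_time(value: str) -> time:
    """Coerce an ISO-8601 time string (no date component) to a native time."""
    return time.fromisoformat(value)

def serialize_time(value: time) -> str:
    """Serialize the native time back to its ISO-8601 string form."""
    return value.isoformat()

t = parse_time("14:30:15")
print(serialize_time(t))   # 14:30:15
```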

Separate result and serialization

At the moment, we have a QueryResult essentially comprised of an OrderedDictionary that represents the data field and a collection of IError representing the errors field.

The scalar values put into the OrderedDictionary are essentially already serialized.

This makes it very easy to create the data structure, but it also discards the .NET values much too early.

For example, our long is currently a string in order to give languages like JavaScript a chance to put it into a correct type.

The idea now is to have an IQueryResult that contains the .NET values and has a type reference for every scalar value.

public readonly struct LeafValue
{
    public INamedType Type { get; }
    public object Value { get; }
}

We then provide an IQueryResultSerializer that basically creates the JSON document or some other transport format.

We would then also provide an extension method that can convert the IQueryResult into an object for use in .NET.
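The separation could look roughly like this (a Python sketch with hypothetical names): the result tree keeps native values paired with their type, and each transport format decides only at the very end how a given scalar type is serialized.

```python
# The result tree stores (type, value) pairs; serialization is deferred so
# each transport format can decide how to render a scalar (e.g. Long ->
# string, since JavaScript numbers cannot hold a full 64-bit long).
class LeafValue:
    def __init__(self, type_name, value):
        self.type_name = type_name   # reference to the schema type
        self.value = value           # the untouched native value

JSON_SERIALIZERS = {
    "Long": lambda v: str(v),        # only serialized here, not earlier
    "Int": lambda v: v,
    "String": lambda v: v,
}

def serialize(result: dict) -> dict:
    return {
        key: JSON_SERIALIZERS[leaf.type_name](leaf.value)
        for key, leaf in result.items()
    }

query_result = {
    "id": LeafValue("Long", 9007199254740993),   # exceeds JS safe integers
    "name": LeafValue("String", "Ada"),
}
print(serialize(query_result))  # {'id': '9007199254740993', 'name': 'Ada'}
```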
