oav's Introduction

openapi-validation-tools [oav]


Tools for validating OpenAPI (Swagger) files.

Requirements

  • node.js version >= 18.x

You can install the latest stable release of Node.js from the official Node.js downloads page. On a machine with a Linux-flavored OS, please follow the distribution-specific Node.js installation instructions.
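
To confirm the requirement is met, check the installed version (the output shown is illustrative):

$ node --version
v18.17.0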

How to install the tool

npm install -g oav@latest

Command usage:

$ oav -h

Commands:
  analyze-dependency                        analyze swagger resource type
                                            dependency.
  analyze-report <newman-report-path>       analyze report. default format:
                                            newman json report
  example-quality <spec-path>               Performs example quality validation
                                            of x-ms-examples and examples
                                            present in the spec.
  extract-xmsexamples <spec-path>           Extracts the x-ms-examples for a
  <recordings>                              given swagger from the .NET session
                                            recordings and saves them in a file.
  generate-collection                       Generate postman collection file
                                            from API scenario.
  generate-examples [spec-path]             Generate swagger examples from real
                                            payload records.
  generate-report [raw-report-path]         Generate report from postman report.
  generate-api-scenario                     Generate swagger examples from real
                                            payload records.
  generate-static-api-scenario              Generate API-scenario from swagger.
  run-api-scenario <api-scenario>           newman runner run API scenario
                                            file.                 [aliases: run]
  validate-example <spec-path>              Performs validation of x-ms-examples
                                            and examples present in the spec.
  validate-spec <spec-path>                 Performs semantic validation of the
                                            spec.
  validate-traffic <traffic-path>           Validate traffic payload against the
  <spec-path>                               spec.
  traffic-convert <input-dir>               Convert a directory of
  <output-dir>                              azure-sdk/test-proxy recording
                                            files (https://github.com/Azure/azure-sdk-tools/tree/main/tools/test-proxy/Azure.Sdk.Tools.TestProxy)
                                            into traffic payloads consumable
                                            by the validate-traffic command
  

Options:
  --version          Show version number                               [boolean]
  -l, --logLevel     Set the logging level for console.
  [choices: "off", "json", "error", "warn", "info", "verbose", "debug", "silly"]
                                                               [default: "info"]
  -f, --logFilepath  Set the log file path. It must be an absolute filepath. By
                     default the logs will stored in a timestamp based log file
                     at "/home/ruowan/oav_output".
  -p, --pretty       Pretty print
  -h, --help         Show help                                         [boolean]
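
For example, based on the commands listed above (the spec path here is hypothetical):

$ oav validate-spec ./specification/redis/resource-manager/redis.json
$ oav validate-example ./specification/redis/resource-manager/redis.json -p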

What does the tool do? What issues does the tool catch?

  • Semantic validation Semantic validation enforces correctness of the Swagger-specific elements, such as paths and operations, ensuring that each element's definition meets the OpenAPI 2.0 specification.

  • Model validation Model validation enforces correctness between the examples and the swagger. It checks whether the definitions for request parameters and responses match the expected input/output payloads of the service.

    Examples of issues detected:

    • Required properties not sent in requests or responses
    • Defined types not matching the value provided in the payload
    • Constraints on properties not met
    • Enumeration values that don’t match the value used by the service.

    Model validation requires example payloads (request/response) of the service, so the data can be matched against the defined models. See the x-ms-examples extension for how to specify the examples/payloads. The Swagger “examples” attribute is also supported, and data included there is validated as well. To get the most benefit from this tool, make sure to include both the simplest and the most complex examples possible as part of x-ms-examples (a minimal sketch of the wiring follows this list).

    • Please take a look at the redis-cache swagger spec in the azure-rest-api-specs repo as an example of providing "x-ms-examples".

    • The examples need to be provided in separate files in the examples directory under the api-version directory: azure-rest-api-specs/arm-<yourService>/<api-version>/examples/<exampleName>.json. Existing specs in that repo show the expected structure of the examples.

    • We require you to provide a minimal example (just the required properties/parameters of the request/response) and a maximal (full-blown) example. Feel free to provide more examples as deemed necessary.

    • We have provided schemas for the example files in the examples directory; they will help you with IntelliSense and validation.

    • If you are using VS Code to edit your swaggers in the azure-rest-api-specs repo, everything should work out of the box, as the schemas have been registered in the repo's .vscode/settings.json file.

    • If you are using Visual Studio, you can take the URLs provided in that settings.json file and select them from the drop-down list at the top of a JSON file when the file is opened in VS.
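
    As a minimal sketch of the wiring (file names and values here are hypothetical), an operation references its example file through the x-ms-examples extension:

    "x-ms-examples": {
      "Create a cache": {
        "$ref": "./examples/RedisCacheCreate.json"
      }
    }

    The referenced example file then supplies the request parameters and per-status-code responses that model validation checks against the definitions:

    {
      "parameters": {
        "resourceGroupName": "myResourceGroup",
        "name": "myCache",
        "api-version": "2015-10-01"
      },
      "responses": {
        "200": {
          "body": {
            "name": "myCache"
          }
        }
      }
    }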

How does this tool fit with others

Swagger spec validation can be split into the following:

  1. Schema validation
  2. Semantic validation
  3. Model definition validation
  4. Swagger operations execution (against mocked data or live tests)
  5. Human-eye review to complement the above

In the context of “azure-rest-api-specs” repo:

  • #1 is being performed on every PR as part of CI.
  • #2 and #3 are performed by the tool currently in openapi-validation-tools repo and by AutoRest linter. We’re working towards integrating them into CI for “azure-rest-api-specs” repo.
  • #4 is not available yet, though we’re starting to work on it.
  • #5 will be done by the approvers of PRs in “azure-rest-api-specs”, as this won’t be automated.

Run API test

OAV supports running API tests against Azure and validating the requests and responses. You can define an API scenario file, composed of several swagger example files, and then use oav to execute it. For more details about API tests, please refer to the API scenario documentation.
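
For example (the scenario file path is hypothetical; see the API scenario documentation for the file format):

$ oav run-api-scenario ./scenarios/basicCRUD.yaml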

Live Validation Mode

  • A Live Validation mode has been added to OAV to enable validation of live traffic.
  • Usage (a sample request-response pair is sketched after the snippet below):
const oav = require("oav");
const os = require("os");
const path = require("path");

const liveValidatorOptions = {
  git: {
    url: "https://github.com/Azure/azure-rest-api-specs.git",
    shouldClone: true,
  },
  directory: path.resolve(os.homedir(), "cloneRepo"),
  swaggerPathsPattern: "/specification/**/resource-manager/**/*.json",
  isPathCaseSensitive: false,
  shouldModelImplicitDefaultResponse: true,
};

const apiValidator = new oav.LiveValidator(liveValidatorOptions);
await apiValidator.initialize(); // Note that for a large number of specs this can take some time.

// After `initialize()` finishes we are ready to validate
const validationResult = apiValidator.validateLiveRequestResponse(requestResponsePair);
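
The requestResponsePair argument is a recorded live request together with its response. A minimal sketch of its shape (the field values are hypothetical; consult the repo's request-response pair sample for the authoritative format):

const requestResponsePair = {
  liveRequest: {
    url: "https://management.azure.com/subscriptions/<subId>/providers/Microsoft.Cache/redis?api-version=2015-10-01",
    method: "GET",
    headers: { "content-type": "application/json" },
    body: {},
  },
  liveResponse: {
    statusCode: "200",
    headers: { "content-type": "application/json" },
    body: { value: [] },
  },
};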

Regression testing

The output of the OAV tool has been snapshotted and committed to the repo. The regression test may be run on a sample of, or all of, https://github.com/azure/azure-rest-api-specs. If there are changes to the snapshots, the build produces a git patch file as an artifact, which may be used to update the snapshots.

Fast Regression (~10mins) is used for merge validation

Slow Regression (~1 hour) is run after merge and should be fixed if it fails

Fixing regression builds

  1. Go to the failed build
  2. Download the artifact patch file
  3. In the OAV directory run git apply <path to patch file> (a concrete sketch follows these steps)
  4. Commit the patched changes and create a pull request
  5. Validate that the changes look ok and don't represent a breaking change in OAV
  6. Merge the PR
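
Concretely, steps 3 and 4 look something like this (the patch file name is hypothetical; use the name of the downloaded artifact):

$ git checkout -b fix-regression-snapshots
$ git apply ~/Downloads/regression.patch
$ git add -A && git commit -m "Update regression snapshots"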

This project has adopted the Microsoft Open Source Code of Conduct. For more information see the Code of Conduct FAQ or contact opencode@microsoft.com with any additional questions or comments.

oav's People

Contributors

amarzavery, billytrend, dependabot[bot], descatles, eddyashton, ericwill-msft, jacktn, jianyexi, keryul, leni-msft, ligengxin96, lirenhe, marstr, mcardosos, mikeharder, mikekistler, ms-zhenhua, nickzhums, raych1, ruowan, salameer, sarangan12, scbedd, sergey-shandar, tianxchen-ms, veronicagg, vishrutshah, wwendyc, xiaoxuqi-ms, zhenglaizhang


oav's Issues

Introduce different levels of results reporting

Allow choosing between verbose and non-verbose logging, and between errors-only and all checks in the output. The current output, with errors followed by "swagger structured" output of passes and failures, is quite verbose and duplicated.

Enable live testing of swagger documents, part 1

  • Authenticate
  • Make a request for each example request body
  • Validate that response code is one of the listed response codes in the example
  • Validate that the response matches the schema for the response

Integrate this tool into rest-api-specs

Reasons it's not there right now:

  • there are lots of existing violations
  • we have warnings coming up in our tools that should not be blocking the CI.

What do we need to do:

  • Differentiate acceptable (non-blocking) warnings vs. blocking warnings.
  • Come up with a plan for existing violations.

Run the spec validator on all swaggers currently in master and see what types of issues we find. If they are low-hanging fruit, we can tackle them; if not, we'll need to reach out to the owners.

A suggestion here is to include the linter and this validation tool in the latest rest-api-specs preview -> master migration proposal. We would not block on the specs in preview, but there shouldn't be any errors reported from these tools for anything in master.

Updating & Standardizing the top level summary of the JSON output of this tool

In our efforts to standardize the output of our validation tools to:

  • Make it more human readable
  • Make it easier and more performant for potential visualizers to process and also categorize the output according to the tools that were used to process it.
  • Provide useful versioning & performance information

I'm suggesting that we start with standardizing the top-level summary of the JSON output of this tool, as follows:

{
  json_file: 'C:\Users\sameal\Documents\GitHub\openapi-validation-tools\arm-mediaservices\2015-10-01\swagger\media.json',
  validationTool: 'model_validator',
  validationToolVersion: '0.1.0',
  validationCommand: 'example',
  validationSummary: {
    validityStatus: false,
    numOfOperationsInFile: '50',
    numOfFailedOperations: '23',
    validationTime: '550'
  },
  validation_result: {
    operations: {
      MediaService_CheckNameAvailability: {
        'example-in-spec': {
          parameters: {
            CheckNameAvailabilityInput: {
              isValid: true,
              result: 'The example for body parameter CheckNameAvailabilityInput with reference #/definitions/CheckNameAvailabilityInput in operation "MediaService_CheckNameAvailability" is valid.'
            }
          },
          responses: {
            '200': {
              isValid: true,
              result: 'The example for response with statusCode "200" in operation "MediaService_CheckNameAvailability" is valid.'
            }
          }
        }
      }
    }
  }
}

Implement Promises inside the tool

  • make it simple inside the spec validator to have separate methods for logical operations
  • these logical operations will be useful in validating the payloads during live testing (a sketch follows below)
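
A minimal sketch of the idea (names and signatures are hypothetical, not the tool's actual API): each logical operation becomes a separate promise-returning method, so live-payload validation can reuse them individually.

// Hypothetical refactoring sketch, not oav's real internals.
class SpecValidator {
  // Parse and resolve the spec; a real implementation would load it from disk or a URL.
  async loadSpec(specPath: string): Promise<Record<string, unknown>> {
    return {};
  }

  // Validate a single payload against one operation's schema (stubbed here).
  async validateOperation(operationId: string, payload: unknown): Promise<boolean> {
    return true;
  }

  // Compose the logical operations; the same methods are reusable for live traffic.
  async validateAll(specPath: string, payloads: Map<string, unknown>): Promise<boolean> {
    await this.loadSpec(specPath);
    const results = await Promise.all(
      [...payloads].map(([id, payload]) => this.validateOperation(id, payload))
    );
    return results.every((ok) => ok);
  }
}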

Suggestion to Executive Summary messaging

Dear wonderful repo contributor:

In this issue I'm suggesting the following enhancements to the executive summary provided after running the tools:

No Validation errors found:

  • Validation completed with no issues found.
  • Number of operations validated.

Errors found:

  • Validation completed with X issues found
  • Number of operations validated.
  • Issues summary:
    -- issue 1
    -- issue 2
    -- issue 3
    .
    .
    .

Link to Validation result visualization

It would be awesome if, in the validation summary block, we provided a link to an online JSON visualizer and passed it the validation JSON for visualization of the results.

It would also be awesome if there's a story here about VS Code (but that would probably be a separate scenario :) )

We should also make sure that we're respecting the licensing of that online tool.

Enable live testing of swagger documents, part 2

  • Provide a way to specify prerequisites for a test (i.e. creation of resources). Suggest using a template.
  • Verify that the actual contents of the response match expected values, in addition to validating conformance to the response type schema

Create Centralized documentation for all of our Swagger validation tools

The idea of this documentation is to point Swagger authors to all the swagger validation tools they can use while authoring their swaggers, and to provide:

  • Clear and simple instructions to install the tools
  • A clear and simple description of how to execute each tool (with complete example commands where applicable)
  • Clear and simple examples of the types of issues each tool catches
  • Guidance on the step of the swagger-authoring process at which each tool should be used

We can document this in the Wiki of this tool or create a documentation folder. This will also help us in our future integration with VS Code.

Long running operation conformance tester

To ensure conformance with a proper long-running-operation implementation, it would be beneficial either to analyze a recording of a long-running operation, or to have a tool you can point at an endpoint that only tries to ensure the proper process is followed.

Handle x-ms-paths

  • Currently the tool is blindly unifying x-ms-paths into the regular paths. This can be a problem if the resulting paths are not unique; make sure that the paths are unique (see the sketch below).
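
For context, x-ms-paths exists to allow path keys that differ only by query string, so naive unification with the regular paths object can produce colliding entries (a minimal sketch; the operation content is illustrative):

"x-ms-paths": {
  "/services?serviceType=media": {
    "get": { "operationId": "MediaServices_List", "responses": { "200": { "description": "OK" } } }
  },
  "/services?serviceType=search": {
    "get": { "operationId": "SearchServices_List", "responses": { "200": { "description": "OK" } } }
  }
}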

Running against a Raw URL throws Parsing errors

Ran the command:
node validate.js spec "https://github.com/Azure/openapi-validation-tools/blob/master/arm-mediaservices/2015-10-01/swagger/media.json" --json

And got the following error. There seems to be an issue with fetching the JSON, or I'm running it wrong; if so, let's make it more idiomatic for AWESOME people like me:

C:\Users\sameal\workspace\openapi-validation-tools-master>node validate.js example https://github.com/Azure/openapi-validation-tools/blob/master/arm-mediaservices/2015-10-01/swagger/media.json --json

Validating "examples" and "x-ms-examples" in https://github.com/Azure/openapi-validation-tools/blob/master/arm-mediaservices/2015-10-01/swagger/media.json:

PARSE_SPEC_ERROR - Error occurred in parsing the spec "https://github.com/Azure/openapi-validation-tools/blob/master/arm-mediaservices/2015-10-01/swagger/media.json". Error parsing https://github.com/Azure/openapi-validation-tools/blob/master/arm-mediaservices/2015-10-01/swagger/media.json
end of the stream or a document separator is expected at line 7, column 19:
<head prefix="og: http://ogp.me/ns# fb: http://o ...
^.
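
Note: the parse failure is consistent with the tool downloading the GitHub blob page, which is HTML (hence the <head prefix="og: ... content in the error), rather than the raw JSON. Pointing the tool at the raw file URL should avoid this (a sketch, assuming the file still exists at this path):

node validate.js example https://raw.githubusercontent.com/Azure/openapi-validation-tools/master/arm-mediaservices/2015-10-01/swagger/media.json --json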

x-ms-odata to be interpreted by the validation tool

When running validate.js for arm-authorization spec and recordings from .net tests, I got the following warnings:
[ { code: 'UNUSED_DEFINITION',
    message: 'Definition is defined but is not used: #/definitions/RoleAssignmentFilter',
    path: [ 'definitions', 'RoleAssignmentFilter' ] },
  { code: 'UNUSED_DEFINITION',
    message: 'Definition is defined but is not used: #/definitions/RoleDefinitionFilter',
    path: [ 'definitions', 'RoleDefinitionFilter' ] } ],
  code: 'INTERNAL_ERROR' } ] }

The definitions are used in the x-ms-odata extension (see below), and it looks like proper usage of them. Thoughts? Because it's our extension, I assume it is not being picked up by the swagger tools, so we'd have to figure out how to account for it.

"/{scope}/providers/Microsoft.Authorization/roleDefinitions": {
"get": {
"tags": [
"RoleDefinitions"
],
"operationId": "RoleDefinitions_List",
"description": "Get all role definitions that are applicable at scope and above.",
"parameters": [
{
"name": "scope",
"in": "path",
"required": true,
"type": "string",
"description": "The scope of the role definition.",
"x-ms-skip-url-encoding": true
},
{
"name": "$filter",
"in": "query",
"required": false,
"type": "string",
"description": "The filter to apply on the operation. Use atScopeAndBelow filter to search below the given scope as well."
},
{
"$ref": "#/parameters/ApiVersionParameter"
}
],
"responses": {
"200": {
"description": "OK - Returns an array of role definitions.",
"schema": {
"$ref": "#/definitions/RoleDefinitionListResult"
}
}
},
"x-ms-pageable": {
"nextLinkName": "nextLink"
},
"x-ms-odata": "#/definitions/RoleDefinitionFilter",
