Commit adb15d79 authored by Hirusha

delete nodemdel

parent a3dedad1
Pipeline #7269 canceled with stages


../acorn/bin/acorn
../escodegen/bin/escodegen.js
../escodegen/bin/esgenerate.js
../esprima/bin/esparse.js
../esprima/bin/esvalidate.js
../fast-xml-parser/src/cli/cli.js
../js-yaml/bin/js-yaml.js
../mime/cli.js
../mkdirp/bin/cmd.js
../nodemon/bin/nodemon.js
../touch/bin/nodetouch.js
../nopt/bin/nopt.js
../semver/bin/semver
../swagger-jsdoc/bin/swagger-jsdoc.js
../uuid/dist/bin/uuid
../vm2/bin/vm2
../z-schema/bin/z-schema
The MIT License (MIT)

Copyright (c) 2015 James Messinger

Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in
all copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
THE SOFTWARE.
JSON Schema $Ref Parser
============================
#### Parse, Resolve, and Dereference JSON Schema $ref pointers
[![Build Status](https://github.com/APIDevTools/json-schema-ref-parser/workflows/CI-CD/badge.svg?branch=master)](https://github.com/APIDevTools/json-schema-ref-parser/actions)
[![Coverage Status](https://coveralls.io/repos/github/APIDevTools/json-schema-ref-parser/badge.svg?branch=master)](https://coveralls.io/github/APIDevTools/json-schema-ref-parser)
[![npm](https://img.shields.io/npm/v/@apidevtools/json-schema-ref-parser.svg)](https://www.npmjs.com/package/@apidevtools/json-schema-ref-parser)
[![Dependencies](https://david-dm.org/APIDevTools/json-schema-ref-parser.svg)](https://david-dm.org/APIDevTools/json-schema-ref-parser)
[![License](https://img.shields.io/npm/l/@apidevtools/json-schema-ref-parser.svg)](LICENSE)
[![Buy us a tree](https://img.shields.io/badge/Treeware-%F0%9F%8C%B3-lightgreen)](https://plant.treeware.earth/APIDevTools/json-schema-ref-parser)
[![OS and Browser Compatibility](https://apitools.dev/img/badges/ci-badges-with-ie.svg)](https://github.com/APIDevTools/json-schema-ref-parser/actions)
The Problem:
--------------------------
You've got a JSON Schema with `$ref` pointers to other files and/or URLs. Maybe you know all the referenced files ahead of time. Maybe you don't. Maybe some are local files, and others are remote URLs. Maybe they are a mix of JSON and YAML format. Maybe some of the files contain cross-references to each other.
```javascript
{
  "definitions": {
    "person": {
      // references an external file
      "$ref": "schemas/people/Bruce-Wayne.json"
    },
    "place": {
      // references a sub-schema in an external file
      "$ref": "schemas/places.yaml#/definitions/Gotham-City"
    },
    "thing": {
      // references a URL
      "$ref": "http://wayne-enterprises.com/things/batmobile"
    },
    "color": {
      // references a value in an external file via an internal reference
      "$ref": "#/definitions/thing/properties/colors/black-as-the-night"
    }
  }
}
```
The Solution:
--------------------------
JSON Schema $Ref Parser is a full [JSON Reference](https://tools.ietf.org/html/draft-pbryan-zyp-json-ref-03) and [JSON Pointer](https://tools.ietf.org/html/rfc6901) implementation that crawls even the most complex [JSON Schemas](http://json-schema.org/latest/json-schema-core.html) and gives you simple, straightforward JavaScript objects.
- Use **JSON** or **YAML** schemas — or even a mix of both!
- Supports `$ref` pointers to external files and URLs, as well as [custom sources](https://apitools.dev/json-schema-ref-parser/docs/plugins/resolvers.html) such as databases
- Can [bundle](https://apitools.dev/json-schema-ref-parser/docs/ref-parser.html#bundlepath-options-callback) multiple files into a single schema that only has _internal_ `$ref` pointers
- Can [dereference](https://apitools.dev/json-schema-ref-parser/docs/ref-parser.html#dereferencepath-options-callback) your schema, producing a plain-old JavaScript object that's easy to work with
- Supports [circular references](https://apitools.dev/json-schema-ref-parser/docs/#circular-refs), nested references, back-references, and cross-references between files
- Maintains object reference equality — `$ref` pointers to the same value always resolve to the same object instance
- Tested in Node v10, v12, & v14, and all major web browsers on Windows, Mac, and Linux
Example
--------------------------
```javascript
$RefParser.dereference(mySchema, (err, schema) => {
  if (err) {
    console.error(err);
  }
  else {
    // `schema` is just a normal JavaScript object that contains your entire JSON Schema,
    // including referenced files, combined into a single object
    console.log(schema.definitions.person.properties.firstName);
  }
});
```
Or use `async`/`await` syntax instead. The following example is the same as above:
```javascript
try {
  let schema = await $RefParser.dereference(mySchema);
  console.log(schema.definitions.person.properties.firstName);
}
catch (err) {
  console.error(err);
}
```
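If you want to combine all the files without inlining every reference, the [`bundle`](https://apitools.dev/json-schema-ref-parser/docs/ref-parser.html#bundlepath-options-callback) method mentioned above works the same way. Here is a minimal sketch, reusing the `mySchema` variable from the examples above:
```javascript
try {
  // `bundle()` merges all external files into a single schema, but keeps
  // internal $ref pointers instead of inlining every value, so the result stays compact
  let bundled = await $RefParser.bundle(mySchema);
  console.log(JSON.stringify(bundled, null, 2));
}
catch (err) {
  console.error(err);
}
```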
For more detailed examples, please see the [API Documentation](https://apitools.dev/json-schema-ref-parser/docs/)
Installation
--------------------------
Install using [npm](https://docs.npmjs.com/about-npm/):
```bash
npm install @apidevtools/json-schema-ref-parser
```
Usage
--------------------------
When using JSON Schema $Ref Parser in Node.js apps, you'll probably want to use **CommonJS** syntax:
```javascript
const $RefParser = require("@apidevtools/json-schema-ref-parser");
```
When using a transpiler such as [Babel](https://babeljs.io/) or [TypeScript](https://www.typescriptlang.org/), or a bundler such as [Webpack](https://webpack.js.org/) or [Rollup](https://rollupjs.org/), you can use **ECMAScript modules** syntax instead:
```javascript
import $RefParser from "@apidevtools/json-schema-ref-parser";
```
Browser support
--------------------------
JSON Schema $Ref Parser supports recent versions of every major web browser. Older browsers may require [Babel](https://babeljs.io/) and/or [polyfills](https://babeljs.io/docs/en/next/babel-polyfill).
To use JSON Schema $Ref Parser in a browser, you'll need to use a bundling tool such as [Webpack](https://webpack.js.org/), [Rollup](https://rollupjs.org/), [Parcel](https://parceljs.org/), or [Browserify](http://browserify.org/). Some bundlers may require a bit of configuration, such as setting `browser: true` in [rollup-plugin-resolve](https://github.com/rollup/rollup-plugin-node-resolve).
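For example, a minimal Rollup configuration along those lines might look like the sketch below (the entry point, output settings, and plugin choices are illustrative, not prescriptive):
```javascript
// rollup.config.js
import resolve from "rollup-plugin-node-resolve";
import commonjs from "rollup-plugin-commonjs";

export default {
  input: "src/index.js",
  output: { file: "dist/bundle.js", format: "iife", name: "myApp" },
  plugins: [
    // `browser: true` makes the resolver prefer browser-ready entry points
    resolve({ browser: true }),
    // converts CommonJS dependencies (such as this package) into ES modules for Rollup
    commonjs()
  ]
};
```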
API Documentation
--------------------------
Full API documentation is available [right here](https://apitools.dev/json-schema-ref-parser/docs/)
Contributing
--------------------------
I welcome any contributions, enhancements, and bug-fixes. [Open an issue](https://github.com/APIDevTools/json-schema-ref-parser/issues) on GitHub and [submit a pull request](https://github.com/APIDevTools/json-schema-ref-parser/pulls).
#### Building/Testing
To build/test the project locally on your computer:
1. __Clone this repo__<br>
`git clone https://github.com/APIDevTools/json-schema-ref-parser.git`
2. __Install dependencies__<br>
`npm install`
3. __Run the tests__<br>
`npm test`
License
--------------------------
JSON Schema $Ref Parser is 100% free and open-source, under the [MIT license](LICENSE). Use it however you want.
This package is [Treeware](http://treeware.earth). If you use it in production, then we ask that you [**buy the world a tree**](https://plant.treeware.earth/APIDevTools/json-schema-ref-parser) to thank us for our work. By contributing to the Treeware forest you’ll be creating employment for local families and restoring wildlife habitats.
Big Thanks To
--------------------------
Thanks to these awesome companies for their support of Open Source developers ❤
[![Stoplight](https://svgshare.com/i/TK5.svg)](https://stoplight.io/?utm_source=github&utm_medium=readme&utm_campaign=json_schema_ref_parser)
[![SauceLabs](https://jstools.dev/img/badges/sauce-labs.svg)](https://saucelabs.com)
[![Coveralls](https://jstools.dev/img/badges/coveralls.svg)](https://coveralls.io)
"use strict";
const $Ref = require("./ref");
const Pointer = require("./pointer");
const { ono } = require("@jsdevtools/ono");
const url = require("./util/url");
module.exports = dereference;
/**
* Crawls the JSON schema, finds all JSON references, and dereferences them.
* This method mutates the JSON schema object, replacing JSON references with their resolved value.
*
* @param {$RefParser} parser
* @param {$RefParserOptions} options
*/
function dereference (parser, options) {
// console.log('Dereferencing $ref pointers in %s', parser.$refs._root$Ref.path);
let dereferenced = crawl(parser.schema, parser.$refs._root$Ref.path, "#", new Set(), new Set(), new Map(), parser.$refs, options);
parser.$refs.circular = dereferenced.circular;
parser.schema = dereferenced.value;
}
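// Illustrative example (hypothetical schema) of the effect of dereference():
//   input:  { a: { $ref: "#/b" }, b: { type: "string" } }
//   output: { a: { type: "string" }, b: { type: "string" } }
// where `a` and `b` end up pointing at the SAME object instance, preserving reference equality.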
/**
* Recursively crawls the given value, and dereferences any JSON references.
*
* @param {*} obj - The value to crawl. If it's not an object or array, it will be ignored.
* @param {string} path - The full path of `obj`, possibly with a JSON Pointer in the hash
* @param {string} pathFromRoot - The path of `obj` from the schema root
* @param {Set<object>} parents - A set of the parent objects that have already been dereferenced
* @param {Set<object>} processedObjects - A set of all the objects that have already been processed
* @param {Map<string,object>} dereferencedCache - A map of all the dereferenced objects
* @param {$Refs} $refs
* @param {$RefParserOptions} options
* @returns {{value: object, circular: boolean}}
*/
function crawl (obj, path, pathFromRoot, parents, processedObjects, dereferencedCache, $refs, options) {
let dereferenced;
let result = {
value: obj,
circular: false
};
let isExcludedPath = options.dereference.excludedPathMatcher;
if (options.dereference.circular === "ignore" || !processedObjects.has(obj)) {
if (obj && typeof obj === "object" && !ArrayBuffer.isView(obj) && !isExcludedPath(pathFromRoot)) {
parents.add(obj);
processedObjects.add(obj);
if ($Ref.isAllowed$Ref(obj, options)) {
dereferenced = dereference$Ref(obj, path, pathFromRoot, parents, processedObjects, dereferencedCache, $refs, options);
result.circular = dereferenced.circular;
result.value = dereferenced.value;
}
else {
for (const key of Object.keys(obj)) {
let keyPath = Pointer.join(path, key);
let keyPathFromRoot = Pointer.join(pathFromRoot, key);
if (isExcludedPath(keyPathFromRoot)) {
continue;
}
let value = obj[key];
let circular = false;
if ($Ref.isAllowed$Ref(value, options)) {
dereferenced = dereference$Ref(value, keyPath, keyPathFromRoot, parents, processedObjects, dereferencedCache, $refs, options);
circular = dereferenced.circular;
// Avoid pointless mutations; breaks frozen objects to no profit
if (obj[key] !== dereferenced.value) {
obj[key] = dereferenced.value;
}
}
else {
if (!parents.has(value)) {
dereferenced = crawl(value, keyPath, keyPathFromRoot, parents, processedObjects, dereferencedCache, $refs, options);
circular = dereferenced.circular;
// Avoid pointless mutations; breaks frozen objects to no profit
if (obj[key] !== dereferenced.value) {
obj[key] = dereferenced.value;
}
}
else {
circular = foundCircularReference(keyPath, $refs, options);
}
}
// Set the "isCircular" flag if this or any other property is circular
result.circular = result.circular || circular;
}
}
parents.delete(obj);
}
}
return result;
}
/**
* Dereferences the given JSON Reference, and then crawls the resulting value.
*
* @param {{$ref: string}} $ref - The JSON Reference to resolve
* @param {string} path - The full path of `$ref`, possibly with a JSON Pointer in the hash
* @param {string} pathFromRoot - The path of `$ref` from the schema root
* @param {Set<object>} parents - A set of the parent objects that have already been dereferenced
* @param {Set<object>} processedObjects - A set of all the objects that have already been dereferenced
* @param {Map<string,object>} dereferencedCache - A map of all the dereferenced objects
* @param {$Refs} $refs
* @param {$RefParserOptions} options
* @returns {{value: object, circular: boolean}}
*/
function dereference$Ref ($ref, path, pathFromRoot, parents, processedObjects, dereferencedCache, $refs, options) {
// console.log('Dereferencing $ref pointer "%s" at %s', $ref.$ref, path);
let $refPath = url.resolve(path, $ref.$ref);
const cache = dereferencedCache.get($refPath);
if (cache) {
const refKeys = Object.keys($ref);
if (refKeys.length > 1) {
const extraKeys = {};
for (let key of refKeys) {
if (key !== "$ref" && !(key in cache.value)) {
extraKeys[key] = $ref[key];
}
}
return {
circular: cache.circular,
value: Object.assign({}, cache.value, extraKeys),
};
}
return cache;
}
let pointer = $refs._resolve($refPath, path, options);
if (pointer === null) {
return {
circular: false,
value: null,
};
}
// Check for circular references
let directCircular = pointer.circular;
let circular = directCircular || parents.has(pointer.value);
circular && foundCircularReference(path, $refs, options);
// Dereference the JSON reference
let dereferencedValue = $Ref.dereference($ref, pointer.value);
// Crawl the dereferenced value (unless it's circular)
if (!circular) {
// Determine if the dereferenced value is circular
let dereferenced = crawl(dereferencedValue, pointer.path, pathFromRoot, parents, processedObjects, dereferencedCache, $refs, options);
circular = dereferenced.circular;
dereferencedValue = dereferenced.value;
}
if (circular && !directCircular && options.dereference.circular === "ignore") {
// The user has chosen to "ignore" circular references, so don't change the value
dereferencedValue = $ref;
}
if (directCircular) {
// The pointer is a DIRECT circular reference (i.e. it references itself).
// So replace the $ref path with the absolute path from the JSON Schema root
dereferencedValue.$ref = pathFromRoot;
}
const dereferencedObject = {
circular,
value: dereferencedValue
};
// only cache if no extra properties than $ref
if (Object.keys($ref).length === 1) {
dereferencedCache.set($refPath, dereferencedObject);
}
return dereferencedObject;
}
/**
* Called when a circular reference is found.
* It sets the {@link $Refs#circular} flag, and throws an error if options.dereference.circular is false.
*
* @param {string} keyPath - The JSON Reference path of the circular reference
* @param {$Refs} $refs
* @param {$RefParserOptions} options
* @returns {boolean} - always returns true, to indicate that a circular reference was found
*/
function foundCircularReference (keyPath, $refs, options) {
$refs.circular = true;
if (!options.dereference.circular) {
throw ono.reference(`Circular $ref pointer found at ${keyPath}`);
}
return true;
}
"use strict";
const Options = require("./options");
module.exports = normalizeArgs;
/**
* Normalizes the given arguments, accounting for optional args.
*
* @param {Arguments} args
* @returns {object}
*/
function normalizeArgs (args) {
let path, schema, options, callback;
args = Array.prototype.slice.call(args);
if (typeof args[args.length - 1] === "function") {
// The last parameter is a callback function
callback = args.pop();
}
if (typeof args[0] === "string") {
// The first parameter is the path
path = args[0];
if (typeof args[2] === "object") {
// The second parameter is the schema, and the third parameter is the options
schema = args[1];
options = args[2];
}
else {
// The second parameter is the options
schema = undefined;
options = args[1];
}
}
else {
// The first parameter is the schema
path = "";
schema = args[0];
options = args[1];
}
if (!(options instanceof Options)) {
options = new Options(options);
}
return {
path,
schema,
options,
callback
};
}
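// Illustrative examples of how the supported call signatures normalize:
//   normalizeArgs(["schema.json", callback])                   => { path: "schema.json", schema: undefined, options, callback }
//   normalizeArgs([schemaObject, { continueOnError: true }])   => { path: "", schema: schemaObject, options, callback: undefined }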
/* eslint lines-around-comment: [2, {beforeBlockComment: false}] */
"use strict";
const jsonParser = require("./parsers/json");
const yamlParser = require("./parsers/yaml");
const textParser = require("./parsers/text");
const binaryParser = require("./parsers/binary");
const fileResolver = require("./resolvers/file");
const httpResolver = require("./resolvers/http");
module.exports = $RefParserOptions;
/**
* Options that determine how JSON schemas are parsed, resolved, and dereferenced.
*
* @param {object|$RefParserOptions} [options] - Overridden options
* @constructor
*/
function $RefParserOptions (options) {
merge(this, $RefParserOptions.defaults);
merge(this, options);
}
$RefParserOptions.defaults = {
/**
* Determines how different types of files will be parsed.
*
* You can add additional parsers of your own, replace an existing one with
* your own implementation, or disable any parser by setting it to false.
*/
parse: {
json: jsonParser,
yaml: yamlParser,
text: textParser,
binary: binaryParser,
},
/**
* Determines how JSON References will be resolved.
*
* You can add additional resolvers of your own, replace an existing one with
* your own implementation, or disable any resolver by setting it to false.
*/
resolve: {
file: fileResolver,
http: httpResolver,
/**
* Determines whether external $ref pointers will be resolved.
* If this option is disabled, then none of the above resolvers will be called.
* Instead, external $ref pointers will simply be ignored.
*
* @type {boolean}
*/
external: true,
},
/**
* By default, JSON Schema $Ref Parser throws the first error it encounters. Setting `continueOnError` to `true`
* causes it to keep processing as much as possible and then throw a single error that contains all errors
* that were encountered.
*/
continueOnError: false,
/**
* Determines the types of JSON references that are allowed.
*/
dereference: {
/**
* Dereference circular (recursive) JSON references?
* If false, then a {@link ReferenceError} will be thrown if a circular reference is found.
* If "ignore", then circular references will not be dereferenced.
*
* @type {boolean|string}
*/
circular: true,
/**
* A function, called for each path, which can return true to stop this path and all
* subpaths from being dereferenced further. This is useful in schemas where some
* subpaths contain literal $ref keys that should not be dereferenced.
*
* @type {function}
*/
excludedPathMatcher: () => false
},
};
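// For illustration, any of the defaults above can be overridden by passing a plain options
// object to the public API, e.g.:
//   $RefParser.dereference(schema, { continueOnError: true, dereference: { circular: "ignore" } });
// Nested option objects are merged recursively by the merge() helper below.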
/**
* Merges the properties of the source object into the target object.
*
* @param {object} target - The object that we're populating
* @param {?object} source - The options that are being merged
* @returns {object}
*/
function merge (target, source) {
if (isMergeable(source)) {
let keys = Object.keys(source);
for (let i = 0; i < keys.length; i++) {
let key = keys[i];
let sourceSetting = source[key];
let targetSetting = target[key];
if (isMergeable(sourceSetting)) {
// It's a nested object, so merge it recursively
target[key] = merge(targetSetting || {}, sourceSetting);
}
else if (sourceSetting !== undefined) {
// It's a scalar value, function, or array. No merging necessary. Just overwrite the target value.
target[key] = sourceSetting;
}
}
}
return target;
}
/**
* Determines whether the given value can be merged,
* or if it is a scalar value that should just override the target value.
*
* @param {*} val
* @returns {Boolean}
*/
function isMergeable (val) {
return val &&
(typeof val === "object") &&
!Array.isArray(val) &&
!(val instanceof RegExp) &&
!(val instanceof Date);
}
"use strict";
const { ono } = require("@jsdevtools/ono");
const url = require("./util/url");
const plugins = require("./util/plugins");
const { ResolverError, ParserError, UnmatchedParserError, UnmatchedResolverError, isHandledError } = require("./util/errors");
module.exports = parse;
/**
* Reads and parses the specified file path or URL.
*
* @param {string} path - This path MUST already be resolved, since `read` doesn't know the resolution context
* @param {$Refs} $refs
* @param {$RefParserOptions} options
*
* @returns {Promise}
* The promise resolves with the parsed file contents, NOT the raw (Buffer) contents.
*/
async function parse (path, $refs, options) {
// Remove the URL fragment, if any
path = url.stripHash(path);
// Add a new $Ref for this file, even though we don't have the value yet.
// This ensures that we don't simultaneously read & parse the same file multiple times
let $ref = $refs._add(path);
// This "file object" will be passed to all resolvers and parsers.
let file = {
url: path,
extension: url.getExtension(path),
};
// Read the file and then parse the data
try {
const resolver = await readFile(file, options, $refs);
$ref.pathType = resolver.plugin.name;
file.data = resolver.result;
const parser = await parseFile(file, options, $refs);
$ref.value = parser.result;
return parser.result;
}
catch (err) {
if (isHandledError(err)) {
$ref.value = err;
}
throw err;
}
}
/**
* Reads the given file, using the configured resolver plugins
*
* @param {object} file - An object containing information about the referenced file
* @param {string} file.url - The full URL of the referenced file
* @param {string} file.extension - The lowercased file extension (e.g. ".txt", ".html", etc.)
* @param {$RefParserOptions} options
*
* @returns {Promise}
* The promise resolves with the raw file contents and the resolver that was used.
*/
function readFile (file, options, $refs) {
return new Promise(((resolve, reject) => {
// console.log('Reading %s', file.url);
// Find the resolvers that can read this file
let resolvers = plugins.all(options.resolve);
resolvers = plugins.filter(resolvers, "canRead", file);
// Run the resolvers, in order, until one of them succeeds
plugins.sort(resolvers);
plugins.run(resolvers, "read", file, $refs)
.then(resolve, onError);
function onError (err) {
if (!err && options.continueOnError) {
// No resolver could be matched
reject(new UnmatchedResolverError(file.url));
}
else if (!err || !("error" in err)) {
// Throw a generic, friendly error.
reject(ono.syntax(`Unable to resolve $ref pointer "${file.url}"`));
}
// Throw the original error, if it's one of our own (user-friendly) errors.
else if (err.error instanceof ResolverError) {
reject(err.error);
}
else {
reject(new ResolverError(err, file.url));
}
}
}));
}
/**
* Parses the given file's contents, using the configured parser plugins.
*
* @param {object} file - An object containing information about the referenced file
* @param {string} file.url - The full URL of the referenced file
* @param {string} file.extension - The lowercased file extension (e.g. ".txt", ".html", etc.)
* @param {*} file.data - The file contents. This will be whatever data type was returned by the resolver
* @param {$RefParserOptions} options
*
* @returns {Promise}
* The promise resolves with the parsed file contents and the parser that was used.
*/
function parseFile (file, options, $refs) {
return new Promise(((resolve, reject) => {
// console.log('Parsing %s', file.url);
// Find the parsers that can read this file type.
// If none of the parsers are an exact match for this file, then we'll try ALL of them.
// This handles situations where the file IS a supported type, just with an unknown extension.
let allParsers = plugins.all(options.parse);
let filteredParsers = plugins.filter(allParsers, "canParse", file);
let parsers = filteredParsers.length > 0 ? filteredParsers : allParsers;
// Run the parsers, in order, until one of them succeeds
plugins.sort(parsers);
plugins.run(parsers, "parse", file, $refs)
.then(onParsed, onError);
function onParsed (parser) {
if (!parser.plugin.allowEmpty && isEmpty(parser.result)) {
reject(ono.syntax(`Error parsing "${file.url}" as ${parser.plugin.name}. \nParsed value is empty`));
}
else {
resolve(parser);
}
}
function onError (err) {
if (!err && options.continueOnError) {
// No parser could be matched
reject(new UnmatchedParserError(file.url));
}
else if (!err || !("error" in err)) {
reject(ono.syntax(`Unable to parse ${file.url}`));
}
else if (err.error instanceof ParserError) {
reject(err.error);
}
else {
reject(new ParserError(err.error.message, file.url));
}
}
}));
}
/**
* Determines whether the parsed value is "empty".
*
* @param {*} value
* @returns {boolean}
*/
function isEmpty (value) {
return value === undefined ||
(typeof value === "object" && Object.keys(value).length === 0) ||
(typeof value === "string" && value.trim().length === 0) ||
(Buffer.isBuffer(value) && value.length === 0);
}
"use strict";
let BINARY_REGEXP = /\.(jpeg|jpg|gif|png|bmp|ico)$/i;
module.exports = {
/**
* The order that this parser will run, in relation to other parsers.
*
* @type {number}
*/
order: 400,
/**
* Whether to allow "empty" files (zero bytes).
*
* @type {boolean}
*/
allowEmpty: true,
/**
* Determines whether this parser can parse a given file reference.
* Parsers that return true will be tried, in order, until one successfully parses the file.
* Parsers that return false will be skipped, UNLESS all parsers returned false, in which case
* every parser will be tried.
*
* @param {object} file - An object containing information about the referenced file
* @param {string} file.url - The full URL of the referenced file
* @param {string} file.extension - The lowercased file extension (e.g. ".txt", ".html", etc.)
* @param {*} file.data - The file contents. This will be whatever data type was returned by the resolver
* @returns {boolean}
*/
canParse (file) {
// Use this parser if the file is a Buffer, and has a known binary extension
return Buffer.isBuffer(file.data) && BINARY_REGEXP.test(file.url);
},
/**
* Parses the given data as a Buffer (byte array).
*
* @param {object} file - An object containing information about the referenced file
* @param {string} file.url - The full URL of the referenced file
* @param {string} file.extension - The lowercased file extension (e.g. ".txt", ".html", etc.)
* @param {*} file.data - The file contents. This will be whatever data type was returned by the resolver
* @returns {Buffer}
*/
parse (file) {
if (Buffer.isBuffer(file.data)) {
return file.data;
}
else {
// This will reject if data is anything other than a string or typed array
return Buffer.from(file.data);
}
}
};
"use strict";
const { ParserError } = require("../util/errors");
module.exports = {
/**
* The order that this parser will run, in relation to other parsers.
*
* @type {number}
*/
order: 100,
/**
* Whether to allow "empty" files. This includes zero-byte files, as well as empty JSON objects.
*
* @type {boolean}
*/
allowEmpty: true,
/**
* Determines whether this parser can parse a given file reference.
* Parsers that match will be tried, in order, until one successfully parses the file.
* Parsers that don't match will be skipped, UNLESS none of the parsers match, in which case
* every parser will be tried.
*
* @type {RegExp|string|string[]|function}
*/
canParse: ".json",
/**
* Parses the given file as JSON
*
* @param {object} file - An object containing information about the referenced file
* @param {string} file.url - The full URL of the referenced file
* @param {string} file.extension - The lowercased file extension (e.g. ".txt", ".html", etc.)
* @param {*} file.data - The file contents. This will be whatever data type was returned by the resolver
* @returns {Promise}
*/
async parse (file) { // eslint-disable-line require-await
let data = file.data;
if (Buffer.isBuffer(data)) {
data = data.toString();
}
if (typeof data === "string") {
if (data.trim().length === 0) {
return; // This mirrors the YAML behavior
}
else {
try {
return JSON.parse(data);
}
catch (e) {
throw new ParserError(e.message, file.url);
}
}
}
else {
// data is already a JavaScript value (object, array, number, null, NaN, etc.)
return data;
}
}
};
"use strict";
const { ParserError } = require("../util/errors");
let TEXT_REGEXP = /\.(txt|htm|html|md|xml|js|min|map|css|scss|less|svg)$/i;
module.exports = {
/**
* The order that this parser will run, in relation to other parsers.
*
* @type {number}
*/
order: 300,
/**
* Whether to allow "empty" files (zero bytes).
*
* @type {boolean}
*/
allowEmpty: true,
/**
* The encoding that the text is expected to be in.
*
* @type {string}
*/
encoding: "utf8",
/**
* Determines whether this parser can parse a given file reference.
* Parsers that return true will be tried, in order, until one successfully parses the file.
* Parsers that return false will be skipped, UNLESS all parsers returned false, in which case
* every parser will be tried.
*
* @param {object} file - An object containing information about the referenced file
* @param {string} file.url - The full URL of the referenced file
* @param {string} file.extension - The lowercased file extension (e.g. ".txt", ".html", etc.)
* @param {*} file.data - The file contents. This will be whatever data type was returned by the resolver
* @returns {boolean}
*/
canParse (file) {
// Use this parser if the file is a string or Buffer, and has a known text-based extension
return (typeof file.data === "string" || Buffer.isBuffer(file.data)) && TEXT_REGEXP.test(file.url);
},
/**
* Parses the given file as text
*
* @param {object} file - An object containing information about the referenced file
* @param {string} file.url - The full URL of the referenced file
* @param {string} file.extension - The lowercased file extension (e.g. ".txt", ".html", etc.)
* @param {*} file.data - The file contents. This will be whatever data type was returned by the resolver
* @returns {string}
*/
parse (file) {
if (typeof file.data === "string") {
return file.data;
}
else if (Buffer.isBuffer(file.data)) {
return file.data.toString(this.encoding);
}
else {
throw new ParserError("data is not text", file.url);
}
}
};
"use strict";
const { ParserError } = require("../util/errors");
const yaml = require("js-yaml");
const { JSON_SCHEMA } = require("js-yaml");
module.exports = {
/**
* The order that this parser will run, in relation to other parsers.
*
* @type {number}
*/
order: 200,
/**
* Whether to allow "empty" files. This includes zero-byte files, as well as empty JSON objects.
*
* @type {boolean}
*/
allowEmpty: true,
/**
* Determines whether this parser can parse a given file reference.
* Parsers that match will be tried, in order, until one successfully parses the file.
* Parsers that don't match will be skipped, UNLESS none of the parsers match, in which case
* every parser will be tried.
*
* @type {RegExp|string[]|function}
*/
canParse: [".yaml", ".yml", ".json"], // JSON is valid YAML
/**
* Parses the given file as YAML
*
* @param {object} file - An object containing information about the referenced file
* @param {string} file.url - The full URL of the referenced file
* @param {string} file.extension - The lowercased file extension (e.g. ".txt", ".html", etc.)
* @param {*} file.data - The file contents. This will be whatever data type was returned by the resolver
* @returns {Promise}
*/
async parse (file) { // eslint-disable-line require-await
let data = file.data;
if (Buffer.isBuffer(data)) {
data = data.toString();
}
if (typeof data === "string") {
try {
return yaml.load(data, { schema: JSON_SCHEMA });
}
catch (e) {
throw new ParserError(e.message, file.url);
}
}
else {
// data is already a JavaScript value (object, array, number, null, NaN, etc.)
return data;
}
}
};
"use strict";
module.exports = Pointer;
const $Ref = require("./ref");
const url = require("./util/url");
const { JSONParserError, InvalidPointerError, MissingPointerError, isHandledError } = require("./util/errors");
const slashes = /\//g;
const tildes = /~/g;
const escapedSlash = /~1/g;
const escapedTilde = /~0/g;
/**
* This class represents a single JSON pointer and its resolved value.
*
* @param {$Ref} $ref
* @param {string} path
* @param {string} [friendlyPath] - The original user-specified path (used for error messages)
* @constructor
*/
function Pointer ($ref, path, friendlyPath) {
/**
* The {@link $Ref} object that contains this {@link Pointer} object.
* @type {$Ref}
*/
this.$ref = $ref;
/**
* The file path or URL, containing the JSON pointer in the hash.
* This path is relative to the path of the main JSON schema file.
* @type {string}
*/
this.path = path;
/**
* The original path or URL, used for error messages.
* @type {string}
*/
this.originalPath = friendlyPath || path;
/**
* The value of the JSON pointer.
* Can be any JSON type, not just objects. Unknown file types are represented as Buffers (byte arrays).
* @type {?*}
*/
this.value = undefined;
/**
* Indicates whether the pointer references itself.
* @type {boolean}
*/
this.circular = false;
/**
* The number of indirect references that were traversed to resolve the value.
* Resolving a single pointer may require resolving multiple $Refs.
* @type {number}
*/
this.indirections = 0;
}
/**
* Resolves the value of a nested property within the given object.
*
* @param {*} obj - The object that will be crawled
* @param {$RefParserOptions} options
* @param {string} pathFromRoot - The path of the location that initiated the resolution
*
* @returns {Pointer}
* Returns a JSON pointer whose {@link Pointer#value} is the resolved value.
* If resolving this value required resolving other JSON references, then
* the {@link Pointer#$ref} and {@link Pointer#path} will reflect the resolution path
* of the resolved value.
*/
Pointer.prototype.resolve = function (obj, options, pathFromRoot) {
let tokens = Pointer.parse(this.path, this.originalPath);
// Crawl the object, one token at a time
this.value = unwrapOrThrow(obj);
for (let i = 0; i < tokens.length; i++) {
if (resolveIf$Ref(this, options)) {
// The $ref path has changed, so append the remaining tokens to the path
this.path = Pointer.join(this.path, tokens.slice(i));
}
if (typeof this.value === "object" && this.value !== null && "$ref" in this.value) {
return this;
}
let token = tokens[i];
if (this.value[token] === undefined || this.value[token] === null) {
this.value = null;
throw new MissingPointerError(token, decodeURI(this.originalPath));
}
else {
this.value = this.value[token];
}
}
// Resolve the final value
if (!this.value || this.value.$ref && url.resolve(this.path, this.value.$ref) !== pathFromRoot) {
resolveIf$Ref(this, options);
}
return this;
};
/**
* Sets the value of a nested property within the given object.
*
* @param {*} obj - The object that will be crawled
* @param {*} value - the value to assign
* @param {$RefParserOptions} options
*
* @returns {*}
* Returns the modified object, or an entirely new object if the entire object is overwritten.
*/
Pointer.prototype.set = function (obj, value, options) {
let tokens = Pointer.parse(this.path);
let token;
if (tokens.length === 0) {
// There are no tokens, replace the entire object with the new value
this.value = value;
return value;
}
// Crawl the object, one token at a time
this.value = unwrapOrThrow(obj);
for (let i = 0; i < tokens.length - 1; i++) {
resolveIf$Ref(this, options);
token = tokens[i];
if (this.value && this.value[token] !== undefined) {
// The token exists
this.value = this.value[token];
}
else {
// The token doesn't exist, so create it
this.value = setValue(this, token, {});
}
}
// Set the value of the final token
resolveIf$Ref(this, options);
token = tokens[tokens.length - 1];
setValue(this, token, value);
// Return the updated object
return obj;
};
/**
* Parses a JSON pointer (or a path containing a JSON pointer in the hash)
* and returns an array of the pointer's tokens.
* (e.g. "schema.json#/definitions/person/name" => ["definitions", "person", "name"])
*
* The pointer is parsed according to RFC 6901
* {@link https://tools.ietf.org/html/rfc6901#section-3}
*
* @param {string} path
* @param {string} [originalPath]
* @returns {string[]}
*/
Pointer.parse = function (path, originalPath) {
// Get the JSON pointer from the path's hash
let pointer = url.getHash(path).substr(1);
// If there's no pointer, then there are no tokens,
// so return an empty array
if (!pointer) {
return [];
}
// Split into an array
pointer = pointer.split("/");
// Decode each part, according to RFC 6901
for (let i = 0; i < pointer.length; i++) {
pointer[i] = decodeURIComponent(pointer[i].replace(escapedSlash, "/").replace(escapedTilde, "~"));
}
if (pointer[0] !== "") {
throw new InvalidPointerError(pointer, originalPath === undefined ? path : originalPath);
}
return pointer.slice(1);
};
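// For illustration, "~1" decodes to "/" and "~0" decodes to "~" per RFC 6901, so:
//   Pointer.parse("schema.json#/paths/~1users~1{id}/get")  =>  ["paths", "/users/{id}", "get"]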
/**
* Creates a JSON pointer path, by joining one or more tokens to a base path.
*
* @param {string} base - The base path (e.g. "schema.json#/definitions/person")
* @param {string|string[]} tokens - The token(s) to append (e.g. ["name", "first"])
* @returns {string}
*/
Pointer.join = function (base, tokens) {
// Ensure that the base path contains a hash
if (base.indexOf("#") === -1) {
base += "#";
}
// Append each token to the base path
tokens = Array.isArray(tokens) ? tokens : [tokens];
for (let i = 0; i < tokens.length; i++) {
let token = tokens[i];
// Encode the token, according to RFC 6901
base += "/" + encodeURIComponent(token.replace(tildes, "~0").replace(slashes, "~1"));
}
return base;
};
/**
* If the given pointer's {@link Pointer#value} is a JSON reference,
* then the reference is resolved and {@link Pointer#value} is replaced with the resolved value.
* In addition, {@link Pointer#path} and {@link Pointer#$ref} are updated to reflect the
* resolution path of the new value.
*
* @param {Pointer} pointer
* @param {$RefParserOptions} options
* @returns {boolean} - Returns `true` if the resolution path changed
*/
function resolveIf$Ref (pointer, options) {
// Is the value a JSON reference? (and allowed?)
if ($Ref.isAllowed$Ref(pointer.value, options)) {
let $refPath = url.resolve(pointer.path, pointer.value.$ref);
if ($refPath === pointer.path) {
// The value is a reference to itself, so there's nothing to do.
pointer.circular = true;
}
else {
let resolved = pointer.$ref.$refs._resolve($refPath, pointer.path, options);
if (resolved === null) {
return false;
}
pointer.indirections += resolved.indirections + 1;
if ($Ref.isExtended$Ref(pointer.value)) {
// This JSON reference "extends" the resolved value, rather than simply pointing to it.
// So the resolved path does NOT change. Just the value does.
pointer.value = $Ref.dereference(pointer.value, resolved.value);
return false;
}
else {
// Resolve the reference
pointer.$ref = resolved.$ref;
pointer.path = resolved.path;
pointer.value = resolved.value;
}
return true;
}
}
}
/**
* Sets the specified token value of the {@link Pointer#value}.
*
* The token is evaluated according to RFC 6901.
* {@link https://tools.ietf.org/html/rfc6901#section-4}
*
* @param {Pointer} pointer - The JSON Pointer whose value will be modified
* @param {string} token - A JSON Pointer token that indicates how to modify `obj`
* @param {*} value - The value to assign
* @returns {*} - Returns the assigned value
*/
function setValue (pointer, token, value) {
if (pointer.value && typeof pointer.value === "object") {
if (token === "-" && Array.isArray(pointer.value)) {
pointer.value.push(value);
}
else {
pointer.value[token] = value;
}
}
else {
throw new JSONParserError(`Error assigning $ref pointer "${pointer.path}". \nCannot set "${token}" of a non-object.`);
}
return value;
}
function unwrapOrThrow (value) {
if (isHandledError(value)) {
throw value;
}
return value;
}
"use strict";
module.exports = $Ref;
const Pointer = require("./pointer");
const { InvalidPointerError, isHandledError, normalizeError } = require("./util/errors");
const { safePointerToPath, stripHash, getHash } = require("./util/url");
/**
* This class represents a single JSON reference and its resolved value.
*
* @class
*/
function $Ref () {
/**
* The file path or URL of the referenced file.
* This path is relative to the path of the main JSON schema file.
*
* This path does NOT contain document fragments (JSON pointers). It always references an ENTIRE file.
* Use methods such as {@link $Ref#get}, {@link $Ref#resolve}, and {@link $Ref#exists} to get
* specific JSON pointers within the file.
*
* @type {string}
*/
this.path = undefined;
/**
* The resolved value of the JSON reference.
* Can be any JSON type, not just objects. Unknown file types are represented as Buffers (byte arrays).
*
* @type {?*}
*/
this.value = undefined;
/**
* The {@link $Refs} object that contains this {@link $Ref} object.
*
* @type {$Refs}
*/
this.$refs = undefined;
/**
* Indicates the type of {@link $Ref#path} (e.g. "file", "http", etc.)
*
* @type {?string}
*/
this.pathType = undefined;
/**
* List of all errors. Undefined if no errors.
*
* @type {Array<JSONParserError | ResolverError | ParserError | MissingPointerError>}
*/
this.errors = undefined;
}
/**
* Pushes an error to errors array.
*
* @param {Array<JSONParserError | JSONParserErrorGroup>} err - The error to be pushed
* @returns {void}
*/
$Ref.prototype.addError = function (err) {
if (this.errors === undefined) {
this.errors = [];
}
const existingErrors = this.errors.map(({ footprint }) => footprint);
// the path has been almost certainly set at this point,
// but just in case something went wrong, normalizeError injects path if necessary
// moreover, certain errors might point at the same spot, so filter them out to reduce noise
if (Array.isArray(err.errors)) {
this.errors.push(...err.errors
.map(normalizeError)
.filter(({ footprint }) => !existingErrors.includes(footprint)),
);
}
else if (!existingErrors.includes(err.footprint)) {
this.errors.push(normalizeError(err));
}
};
/**
* Determines whether the given JSON reference exists within this {@link $Ref#value}.
*
* @param {string} path - The full path being resolved, optionally with a JSON pointer in the hash
* @param {$RefParserOptions} options
* @returns {boolean}
*/
$Ref.prototype.exists = function (path, options) {
try {
this.resolve(path, options);
return true;
}
catch (e) {
return false;
}
};
/**
* Resolves the given JSON reference within this {@link $Ref#value} and returns the resolved value.
*
* @param {string} path - The full path being resolved, optionally with a JSON pointer in the hash
* @param {$RefParserOptions} options
* @returns {*} - Returns the resolved value
*/
$Ref.prototype.get = function (path, options) {
return this.resolve(path, options).value;
};
/**
* Resolves the given JSON reference within this {@link $Ref#value}.
*
* @param {string} path - The full path being resolved, optionally with a JSON pointer in the hash
* @param {$RefParserOptions} options
* @param {string} friendlyPath - The original user-specified path (used for error messages)
* @param {string} pathFromRoot - The path of `obj` from the schema root
* @returns {Pointer | null}
*/
$Ref.prototype.resolve = function (path, options, friendlyPath, pathFromRoot) {
let pointer = new Pointer(this, path, friendlyPath);
try {
return pointer.resolve(this.value, options, pathFromRoot);
}
catch (err) {
if (!options || !options.continueOnError || !isHandledError(err)) {
throw err;
}
if (err.path === null) {
err.path = safePointerToPath(getHash(pathFromRoot));
}
if (err instanceof InvalidPointerError) {
// this is a special case - InvalidPointerError is thrown when dereferencing external file,
// but the issue is caused by the source file that referenced the file that undergoes dereferencing
err.source = decodeURI(stripHash(pathFromRoot));
}
this.addError(err);
return null;
}
};
/**
* Sets the value of a nested property within this {@link $Ref#value}.
* If the property, or any of its parents don't exist, they will be created.
*
* @param {string} path - The full path of the property to set, optionally with a JSON pointer in the hash
* @param {*} value - The value to assign
*/
$Ref.prototype.set = function (path, value) {
let pointer = new Pointer(this, path);
this.value = pointer.set(this.value, value);
};
/**
* Determines whether the given value is a JSON reference.
*
* @param {*} value - The value to inspect
* @returns {boolean}
*/
$Ref.is$Ref = function (value) {
return value && typeof value === "object" && typeof value.$ref === "string" && value.$ref.length > 0;
};
/**
* Determines whether the given value is an external JSON reference.
*
* @param {*} value - The value to inspect
* @returns {boolean}
*/
$Ref.isExternal$Ref = function (value) {
return $Ref.is$Ref(value) && value.$ref[0] !== "#";
};
/**
* Determines whether the given value is a JSON reference, and whether it is allowed by the options.
* For example, if it references an external file, then options.resolve.external must be true.
*
* @param {*} value - The value to inspect
* @param {$RefParserOptions} options
* @returns {boolean}
*/
$Ref.isAllowed$Ref = function (value, options) {
if ($Ref.is$Ref(value)) {
if (value.$ref.substr(0, 2) === "#/" || value.$ref === "#") {
// It's a JSON Pointer reference, which is always allowed
return true;
}
else if (value.$ref[0] !== "#" && (!options || options.resolve.external)) {
// It's an external reference, which is allowed by the options
return true;
}
}
};
/**
* Determines whether the given value is a JSON reference that "extends" its resolved value.
* That is, it has extra properties (in addition to "$ref"), so rather than simply pointing to
* an existing value, this $ref actually creates a NEW value that is a shallow copy of the resolved
* value, plus the extra properties.
*
* @example:
* {
* person: {
* properties: {
* firstName: { type: string }
* lastName: { type: string }
* }
* }
* employee: {
* properties: {
* $ref: #/person/properties
* salary: { type: number }
* }
* }
* }
*
* In this example, "employee" is an extended $ref, since it extends "person" with an additional
* property (salary). The result is a NEW value that looks like this:
*
* {
* properties: {
* firstName: { type: string }
* lastName: { type: string }
* salary: { type: number }
* }
* }
*
* @param {*} value - The value to inspect
* @returns {boolean}
*/
$Ref.isExtended$Ref = function (value) {
return $Ref.is$Ref(value) && Object.keys(value).length > 1;
};
/**
* Returns the resolved value of a JSON Reference.
* If necessary, the resolved value is merged with the JSON Reference to create a new object
*
* @example:
* {
* person: {
* properties: {
* firstName: { type: string }
* lastName: { type: string }
* }
* }
* employee: {
* properties: {
* $ref: #/person/properties
* salary: { type: number }
* }
* }
* }
*
* When "person" and "employee" are merged, you end up with the following object:
*
* {
* properties: {
* firstName: { type: string }
* lastName: { type: string }
* salary: { type: number }
* }
* }
*
* @param {object} $ref - The JSON reference object (the one with the "$ref" property)
* @param {*} resolvedValue - The resolved value, which can be any type
* @returns {*} - Returns the dereferenced value
*/
$Ref.dereference = function ($ref, resolvedValue) {
if (resolvedValue && typeof resolvedValue === "object" && $Ref.isExtended$Ref($ref)) {
let merged = {};
for (let key of Object.keys($ref)) {
if (key !== "$ref") {
merged[key] = $ref[key];
}
}
for (let key of Object.keys(resolvedValue)) {
if (!(key in merged)) {
merged[key] = resolvedValue[key];
}
}
return merged;
}
else {
// Completely replace the original reference with the resolved value
return resolvedValue;
}
};
"use strict";
const { ono } = require("@jsdevtools/ono");
const $Ref = require("./ref");
const url = require("./util/url");
module.exports = $Refs;
/**
* This class is a map of JSON references and their resolved values.
*/
function $Refs () {
/**
* Indicates whether the schema contains any circular references.
*
* @type {boolean}
*/
this.circular = false;
/**
* A map of paths/urls to {@link $Ref} objects
*
* @type {object}
* @protected
*/
this._$refs = {};
/**
* The {@link $Ref} object that is the root of the JSON schema.
*
* @type {$Ref}
* @protected
*/
this._root$Ref = null;
}
/**
* Returns the paths of all the files/URLs that are referenced by the JSON schema,
* including the schema itself.
*
* @param {...string|string[]} [types] - Only return paths of the given types ("file", "http", etc.)
* @returns {string[]}
*/
$Refs.prototype.paths = function (types) { // eslint-disable-line no-unused-vars
let paths = getPaths(this._$refs, arguments);
return paths.map((path) => {
return path.decoded;
});
};
/**
* Returns the map of JSON references and their resolved values.
*
* @param {...string|string[]} [types] - Only return references of the given types ("file", "http", etc.)
* @returns {object}
*/
$Refs.prototype.values = function (types) { // eslint-disable-line no-unused-vars
let $refs = this._$refs;
let paths = getPaths($refs, arguments);
return paths.reduce((obj, path) => {
obj[path.decoded] = $refs[path.encoded].value;
return obj;
}, {});
};
/**
* Returns a POJO (plain old JavaScript object) for serialization as JSON.
*
* @returns {object}
*/
$Refs.prototype.toJSON = $Refs.prototype.values;
/**
* Determines whether the given JSON reference exists.
*
* @param {string} path - The path being resolved, optionally with a JSON pointer in the hash
* @param {$RefParserOptions} [options]
* @returns {boolean}
*/
$Refs.prototype.exists = function (path, options) {
try {
this._resolve(path, "", options);
return true;
}
catch (e) {
return false;
}
};
/**
* Resolves the given JSON reference and returns the resolved value.
*
* @param {string} path - The path being resolved, with a JSON pointer in the hash
* @param {$RefParserOptions} [options]
* @returns {*} - Returns the resolved value
*/
$Refs.prototype.get = function (path, options) {
return this._resolve(path, "", options).value;
};
/**
* Sets the value of a nested property within this {@link $Ref#value}.
* If the property, or any of its parents don't exist, they will be created.
*
* @param {string} path - The path of the property to set, optionally with a JSON pointer in the hash
* @param {*} value - The value to assign
*/
$Refs.prototype.set = function (path, value) {
let absPath = url.resolve(this._root$Ref.path, path);
let withoutHash = url.stripHash(absPath);
let $ref = this._$refs[withoutHash];
if (!$ref) {
throw ono(`Error resolving $ref pointer "${path}". \n"${withoutHash}" not found.`);
}
$ref.set(absPath, value);
};
/**
* Creates a new {@link $Ref} object and adds it to this {@link $Refs} object.
*
* @param {string} path - The file path or URL of the referenced file
*/
$Refs.prototype._add = function (path) {
let withoutHash = url.stripHash(path);
let $ref = new $Ref();
$ref.path = withoutHash;
$ref.$refs = this;
this._$refs[withoutHash] = $ref;
this._root$Ref = this._root$Ref || $ref;
return $ref;
};
/**
* Resolves the given JSON reference.
*
* @param {string} path - The path being resolved, optionally with a JSON pointer in the hash
* @param {string} pathFromRoot - The path of `obj` from the schema root
* @param {$RefParserOptions} [options]
* @returns {Pointer}
* @protected
*/
$Refs.prototype._resolve = function (path, pathFromRoot, options) {
let absPath = url.resolve(this._root$Ref.path, path);
let withoutHash = url.stripHash(absPath);
let $ref = this._$refs[withoutHash];
if (!$ref) {
throw ono(`Error resolving $ref pointer "${path}". \n"${withoutHash}" not found.`);
}
return $ref.resolve(absPath, options, path, pathFromRoot);
};
/**
* Returns the specified {@link $Ref} object, or undefined.
*
* @param {string} path - The path being resolved, optionally with a JSON pointer in the hash
* @returns {$Ref|undefined}
* @protected
*/
$Refs.prototype._get$Ref = function (path) {
path = url.resolve(this._root$Ref.path, path);
let withoutHash = url.stripHash(path);
return this._$refs[withoutHash];
};
/**
* Returns the encoded and decoded path keys of the given object.
*
* @param {object} $refs - The object whose keys are URL-encoded paths
* @param {...string|string[]} [types] - Only return paths of the given types ("file", "http", etc.)
* @returns {object[]}
*/
function getPaths ($refs, types) {
let paths = Object.keys($refs);
// Filter the paths by type
types = Array.isArray(types[0]) ? types[0] : Array.prototype.slice.call(types);
if (types.length > 0 && types[0]) {
paths = paths.filter((key) => {
return types.indexOf($refs[key].pathType) !== -1;
});
}
// Decode local filesystem paths
return paths.map((path) => {
return {
encoded: path,
decoded: $refs[path].pathType === "file" ? url.toFileSystemPath(path, true) : path
};
});
}
"use strict";
const $Ref = require("./ref");
const Pointer = require("./pointer");
const parse = require("./parse");
const url = require("./util/url");
const { isHandledError } = require("./util/errors");
module.exports = resolveExternal;
/**
* Crawls the JSON schema, finds all external JSON references, and resolves their values.
* This method does not mutate the JSON schema. The resolved values are added to {@link $RefParser#$refs}.
*
* NOTE: We only care about EXTERNAL references here. INTERNAL references are only relevant when dereferencing.
*
* @param {$RefParser} parser
* @param {$RefParserOptions} options
*
* @returns {Promise}
* The promise resolves once all JSON references in the schema have been resolved,
* including nested references that are contained in externally-referenced files.
*/
function resolveExternal (parser, options) {
if (!options.resolve.external) {
// Nothing to resolve, so exit early
return Promise.resolve();
}
try {
// console.log('Resolving $ref pointers in %s', parser.$refs._root$Ref.path);
let promises = crawl(parser.schema, parser.$refs._root$Ref.path + "#", parser.$refs, options);
return Promise.all(promises);
}
catch (e) {
return Promise.reject(e);
}
}
/**
* Recursively crawls the given value, and resolves any external JSON references.
*
* @param {*} obj - The value to crawl. If it's not an object or array, it will be ignored.
* @param {string} path - The full path of `obj`, possibly with a JSON Pointer in the hash
* @param {$Refs} $refs
* @param {$RefParserOptions} options
* @param {Set} seen - Internal.
*
* @returns {Promise[]}
* Returns an array of promises. There will be one promise for each JSON reference in `obj`.
* If `obj` does not contain any JSON references, then the array will be empty.
* If any of the JSON references point to files that contain additional JSON references,
* then the corresponding promise will internally reference an array of promises.
*/
function crawl (obj, path, $refs, options, seen) {
seen = seen || new Set();
let promises = [];
if (obj && typeof obj === "object" && !ArrayBuffer.isView(obj) && !seen.has(obj)) {
seen.add(obj); // Track previously seen objects to avoid infinite recursion
if ($Ref.isExternal$Ref(obj)) {
promises.push(resolve$Ref(obj, path, $refs, options));
}
else {
for (let key of Object.keys(obj)) {
let keyPath = Pointer.join(path, key);
let value = obj[key];
if ($Ref.isExternal$Ref(value)) {
promises.push(resolve$Ref(value, keyPath, $refs, options));
}
else {
promises = promises.concat(crawl(value, keyPath, $refs, options, seen));
}
}
}
}
return promises;
}
/**
* Resolves the given JSON Reference, and then crawls the resulting value.
*
* @param {{$ref: string}} $ref - The JSON Reference to resolve
* @param {string} path - The full path of `$ref`, possibly with a JSON Pointer in the hash
* @param {$Refs} $refs
* @param {$RefParserOptions} options
*
* @returns {Promise}
* The promise resolves once all JSON references in the object have been resolved,
* including nested references that are contained in externally-referenced files.
*/
async function resolve$Ref ($ref, path, $refs, options) {
// console.log('Resolving $ref pointer "%s" at %s', $ref.$ref, path);
let resolvedPath = url.resolve(path, $ref.$ref);
let withoutHash = url.stripHash(resolvedPath);
// Do we already have this $ref?
$ref = $refs._$refs[withoutHash];
if ($ref) {
// We've already parsed this $ref, so use the existing value
return Promise.resolve($ref.value);
}
// Parse the $referenced file/url
try {
const result = await parse(resolvedPath, $refs, options);
// Crawl the parsed value
// console.log('Resolving $ref pointers in %s', withoutHash);
let promises = crawl(result, withoutHash + "#", $refs, options);
return Promise.all(promises);
}
catch (err) {
if (!options.continueOnError || !isHandledError(err)) {
throw err;
}
if ($refs._$refs[withoutHash]) {
err.source = decodeURI(url.stripHash(path));
err.path = url.safePointerToPath(url.getHash(path));
}
return [];
}
}
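/**
 * Illustrative usage sketch (added for clarity; not part of the original sources).
 * It shows how the external-resolution crawl above is typically reached through the
 * library's documented public API. The "schema.json" path is a placeholder.
 */
async function listResolvedFiles () {
  const $RefParser = require("@apidevtools/json-schema-ref-parser");
  // resolve() parses the root schema, then crawls it (as above), reading every
  // external $ref it finds, whether local files or remote URLs.
  let $refs = await $RefParser.resolve("schema.json");
  console.log($refs.paths()); // every file/URL that ended up in the $refs cache
}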
"use strict";
const fs = require("fs");
const { ono } = require("@jsdevtools/ono");
const url = require("../util/url");
const { ResolverError } = require("../util/errors");
module.exports = {
/**
* The order that this resolver will run, in relation to other resolvers.
*
* @type {number}
*/
order: 100,
/**
* Determines whether this resolver can read a given file reference.
* Resolvers that return true will be tried, in order, until one successfully resolves the file.
* Resolvers that return false will not be given a chance to resolve the file.
*
* @param {object} file - An object containing information about the referenced file
* @param {string} file.url - The full URL of the referenced file
* @param {string} file.extension - The lowercased file extension (e.g. ".txt", ".html", etc.)
* @returns {boolean}
*/
canRead (file) {
return url.isFileSystemPath(file.url);
},
/**
* Reads the given file and returns its raw contents as a Buffer.
*
* @param {object} file - An object containing information about the referenced file
* @param {string} file.url - The full URL of the referenced file
* @param {string} file.extension - The lowercased file extension (e.g. ".txt", ".html", etc.)
* @returns {Promise<Buffer>}
*/
read (file) {
return new Promise(((resolve, reject) => {
let path;
try {
path = url.toFileSystemPath(file.url);
}
catch (err) {
reject(new ResolverError(ono.uri(err, `Malformed URI: ${file.url}`), file.url));
}
// console.log('Opening file: %s', path);
try {
fs.readFile(path, (err, data) => {
if (err) {
reject(new ResolverError(ono(err, `Error opening file "${path}"`), path));
}
else {
resolve(data);
}
});
}
catch (err) {
reject(new ResolverError(ono(err, `Error opening file "${path}"`), path));
}
}));
}
};
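/**
 * Illustrative usage sketch (added for clarity; not part of the original sources).
 * The resolver above follows the plugin shape the library documents: an object with
 * `order`, `canRead`, and `read`. A custom resolver with the same shape can be passed
 * via `options.resolve`; the "db" name, the `db://` scheme, and the stub value below
 * are hypothetical placeholders.
 */
async function dereferenceWithCustomResolver (schema) {
  const $RefParser = require("@apidevtools/json-schema-ref-parser");
  return $RefParser.dereference(schema, {
    resolve: {
      db: {
        order: 50,               // runs before the built-in file (100) and http (200) resolvers
        canRead: /^db:\/\//i,    // shorthand: match $ref URLs by RegExp
        async read (file) {
          // Return the raw contents for the matched URL (a string, Buffer, or object).
          // A real resolver would fetch the data; this stub just returns a tiny schema.
          return { type: "string", description: `stub for ${file.url}` };
        }
      }
    }
  });
}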
"use strict";
const http = require("http");
const https = require("https");
const { ono } = require("@jsdevtools/ono");
const url = require("../util/url");
const { ResolverError } = require("../util/errors");
module.exports = {
/**
* The order that this resolver will run, in relation to other resolvers.
*
* @type {number}
*/
order: 200,
/**
* HTTP headers to send when downloading files.
*
* @example:
* {
* "User-Agent": "JSON Schema $Ref Parser",
* Accept: "application/json"
* }
*
* @type {object}
*/
headers: null,
/**
* HTTP request timeout (in milliseconds).
*
* @type {number}
*/
timeout: 5000, // 5 seconds
/**
* The maximum number of HTTP redirects to follow.
* To disable automatic following of redirects, set this to zero.
*
* @type {number}
*/
redirects: 5,
/**
* The `withCredentials` option of XMLHttpRequest.
* Set this to `true` if you're downloading files from a CORS-enabled server that requires authentication
*
* @type {boolean}
*/
withCredentials: false,
/**
* Determines whether this resolver can read a given file reference.
* Resolvers that return true will be tried in order, until one successfully resolves the file.
* Resolvers that return false will not be given a chance to resolve the file.
*
* @param {object} file - An object containing information about the referenced file
* @param {string} file.url - The full URL of the referenced file
* @param {string} file.extension - The lowercased file extension (e.g. ".txt", ".html", etc.)
* @returns {boolean}
*/
canRead (file) {
return url.isHttp(file.url);
},
/**
* Reads the given URL and returns its raw contents as a Buffer.
*
* @param {object} file - An object containing information about the referenced file
* @param {string} file.url - The full URL of the referenced file
* @param {string} file.extension - The lowercased file extension (e.g. ".txt", ".html", etc.)
* @returns {Promise<Buffer>}
*/
read (file) {
let u = url.parse(file.url);
if (process.browser && !u.protocol) {
// Use the protocol of the current page
u.protocol = url.parse(location.href).protocol;
}
return download(u, this);
}
};
/**
* Downloads the given file.
*
* @param {Url|string} u - The url to download (can be a parsed {@link Url} object)
* @param {object} httpOptions - The `options.resolve.http` object
 * @param {string[]} [redirects] - The redirect URLs that have already been followed
*
* @returns {Promise<Buffer>}
* The promise resolves with the raw downloaded data, or rejects if there is an HTTP error.
*/
function download (u, httpOptions, redirects) {
return new Promise(((resolve, reject) => {
u = url.parse(u);
redirects = redirects || [];
redirects.push(u.href);
get(u, httpOptions)
.then((res) => {
if (res.statusCode >= 400) {
throw ono({ status: res.statusCode }, `HTTP ERROR ${res.statusCode}`);
}
else if (res.statusCode >= 300) {
if (redirects.length > httpOptions.redirects) {
reject(new ResolverError(ono({ status: res.statusCode },
`Error downloading ${redirects[0]}. \nToo many redirects: \n ${redirects.join(" \n ")}`)));
}
else if (!res.headers.location) {
throw ono({ status: res.statusCode }, `HTTP ${res.statusCode} redirect with no location header`);
}
else {
// console.log('HTTP %d redirect %s -> %s', res.statusCode, u.href, res.headers.location);
let redirectTo = url.resolve(u, res.headers.location);
download(redirectTo, httpOptions, redirects).then(resolve, reject);
}
}
else {
resolve(res.body || Buffer.alloc(0));
}
})
.catch((err) => {
reject(new ResolverError(ono(err, `Error downloading ${u.href}`), u.href));
});
}));
}
/**
* Sends an HTTP GET request.
*
* @param {Url} u - A parsed {@link Url} object
* @param {object} httpOptions - The `options.resolve.http` object
*
* @returns {Promise<Response>}
* The promise resolves with the HTTP Response object.
*/
function get (u, httpOptions) {
return new Promise(((resolve, reject) => {
// console.log('GET', u.href);
let protocol = u.protocol === "https:" ? https : http;
let req = protocol.get({
hostname: u.hostname,
port: u.port,
path: u.path,
auth: u.auth,
protocol: u.protocol,
headers: httpOptions.headers || {},
withCredentials: httpOptions.withCredentials
});
if (typeof req.setTimeout === "function") {
req.setTimeout(httpOptions.timeout);
}
req.on("timeout", () => {
req.abort();
});
req.on("error", reject);
req.once("response", (res) => {
res.body = Buffer.alloc(0);
res.on("data", (data) => {
res.body = Buffer.concat([res.body, Buffer.from(data)]);
});
res.on("error", reject);
res.on("end", () => {
resolve(res);
});
});
}));
}
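/**
 * Illustrative usage sketch (added for clarity; not part of the original sources).
 * The defaults defined above (headers, timeout, redirects, withCredentials) can be
 * overridden per call through `options.resolve.http`. The URL and token below are
 * placeholders.
 */
async function dereferenceRemoteSchema () {
  const $RefParser = require("@apidevtools/json-schema-ref-parser");
  return $RefParser.dereference("https://example.com/api/schema.json", {
    resolve: {
      http: {
        headers: { Authorization: "Bearer <token>" }, // sent with every GET issued by this resolver
        timeout: 30000,                               // 30-second request timeout
        redirects: 3                                  // follow at most 3 redirects
      }
    }
  });
}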
"use strict";
const { Ono } = require("@jsdevtools/ono");
const { stripHash, toFileSystemPath } = require("./url");
const JSONParserError = exports.JSONParserError = class JSONParserError extends Error {
constructor (message, source) {
super();
this.code = "EUNKNOWN";
this.message = message;
this.source = source;
this.path = null;
Ono.extend(this);
}
get footprint () {
return `${this.path}+${this.source}+${this.code}+${this.message}`;
}
};
setErrorName(JSONParserError);
const JSONParserErrorGroup = exports.JSONParserErrorGroup = class JSONParserErrorGroup extends Error {
constructor (parser) {
super();
this.files = parser;
this.message = `${this.errors.length} error${this.errors.length > 1 ? "s" : ""} occurred while reading '${toFileSystemPath(parser.$refs._root$Ref.path)}'`;
Ono.extend(this);
}
static getParserErrors (parser) {
const errors = [];
for (const $ref of Object.values(parser.$refs._$refs)) {
if ($ref.errors) {
errors.push(...$ref.errors);
}
}
return errors;
}
get errors () {
return JSONParserErrorGroup.getParserErrors(this.files);
}
};
setErrorName(JSONParserErrorGroup);
const ParserError = exports.ParserError = class ParserError extends JSONParserError {
constructor (message, source) {
super(`Error parsing ${source}: ${message}`, source);
this.code = "EPARSER";
}
};
setErrorName(ParserError);
const UnmatchedParserError = exports.UnmatchedParserError = class UnmatchedParserError extends JSONParserError {
constructor (source) {
super(`Could not find parser for "${source}"`, source);
this.code = "EUNMATCHEDPARSER";
}
};
setErrorName(UnmatchedParserError);
const ResolverError = exports.ResolverError = class ResolverError extends JSONParserError {
constructor (ex, source) {
super(ex.message || `Error reading file "${source}"`, source);
this.code = "ERESOLVER";
if ("code" in ex) {
this.ioErrorCode = String(ex.code);
}
}
};
setErrorName(ResolverError);
const UnmatchedResolverError = exports.UnmatchedResolverError = class UnmatchedResolverError extends JSONParserError {
constructor (source) {
super(`Could not find resolver for "${source}"`, source);
this.code = "EUNMATCHEDRESOLVER";
}
};
setErrorName(UnmatchedResolverError);
const MissingPointerError = exports.MissingPointerError = class MissingPointerError extends JSONParserError {
constructor (token, path) {
super(`Token "${token}" does not exist.`, stripHash(path));
this.code = "EMISSINGPOINTER";
}
};
setErrorName(MissingPointerError);
const InvalidPointerError = exports.InvalidPointerError = class InvalidPointerError extends JSONParserError {
constructor (pointer, path) {
super(`Invalid $ref pointer "${pointer}". Pointers must begin with "#/"`, stripHash(path));
this.code = "EINVALIDPOINTER";
}
};
setErrorName(InvalidPointerError);
function setErrorName (err) {
Object.defineProperty(err.prototype, "name", {
value: err.name,
enumerable: true,
});
}
exports.isHandledError = function (err) {
return err instanceof JSONParserError || err instanceof JSONParserErrorGroup;
};
exports.normalizeError = function (err) {
if (err.path === null) {
err.path = [];
}
return err;
};
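/**
 * Illustrative usage sketch (added for clarity; not part of the original sources).
 * With `continueOnError: true`, handled errors (the JSONParserError subclasses above)
 * are collected per file and surfaced together as a JSONParserErrorGroup, whose
 * `errors` getter aggregates them. The "schema.json" path is a placeholder.
 */
async function collectAllErrors () {
  const $RefParser = require("@apidevtools/json-schema-ref-parser");
  try {
    await $RefParser.dereference("schema.json", { continueOnError: true });
  }
  catch (err) {
    if (Array.isArray(err.errors)) {
      // One entry per problem, each carrying `code`, `message`, `source`, and `path`
      for (let e of err.errors) {
        console.error(`${e.code} in ${e.source}: ${e.message}`);
      }
    }
    else {
      throw err; // not an error group, so re-throw
    }
  }
}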
"use strict";
/**
* Returns the given plugins as an array, rather than an object map.
* All other methods in this module expect an array of plugins rather than an object map.
*
* @param {object} plugins - A map of plugin objects
* @return {object[]}
*/
exports.all = function (plugins) {
return Object.keys(plugins)
.filter((key) => {
return typeof plugins[key] === "object";
})
.map((key) => {
plugins[key].name = key;
return plugins[key];
});
};
/**
 * Filters the given plugins, returning only the ones that return `true` for the given method.
*
* @param {object[]} plugins - An array of plugin objects
* @param {string} method - The name of the filter method to invoke for each plugin
* @param {object} file - A file info object, which will be passed to each method
* @return {object[]}
*/
exports.filter = function (plugins, method, file) {
return plugins
.filter((plugin) => {
return !!getResult(plugin, method, file);
});
};
/**
* Sorts the given plugins, in place, by their `order` property.
*
* @param {object[]} plugins - An array of plugin objects
* @returns {object[]}
*/
exports.sort = function (plugins) {
for (let plugin of plugins) {
plugin.order = plugin.order || Number.MAX_SAFE_INTEGER;
}
return plugins.sort((a, b) => { return a.order - b.order; });
};
/**
* Runs the specified method of the given plugins, in order, until one of them returns a successful result.
* Each method can return a synchronous value, a Promise, or call an error-first callback.
* If the promise resolves successfully, or the callback is called without an error, then the result
* is immediately returned and no further plugins are called.
* If the promise rejects, or the callback is called with an error, then the next plugin is called.
* If ALL plugins fail, then the last error is thrown.
*
* @param {object[]} plugins - An array of plugin objects
* @param {string} method - The name of the method to invoke for each plugin
* @param {object} file - A file info object, which will be passed to each method
* @returns {Promise}
*/
exports.run = function (plugins, method, file, $refs) {
let plugin, lastError, index = 0;
return new Promise(((resolve, reject) => {
runNextPlugin();
function runNextPlugin () {
plugin = plugins[index++];
if (!plugin) {
// There are no more functions, so re-throw the last error
return reject(lastError);
}
try {
// console.log(' %s', plugin.name);
let result = getResult(plugin, method, file, callback, $refs);
if (result && typeof result.then === "function") {
// A promise was returned
result.then(onSuccess, onError);
}
else if (result !== undefined) {
// A synchronous result was returned
onSuccess(result);
}
else if (index === plugins.length) {
throw new Error("No promise has been returned or callback has been called.");
}
}
catch (e) {
onError(e);
}
}
function callback (err, result) {
if (err) {
onError(err);
}
else {
onSuccess(result);
}
}
function onSuccess (result) {
// console.log(' success');
resolve({
plugin,
result
});
}
function onError (error) {
// console.log(' %s', err.message || err);
lastError = {
plugin,
error,
};
runNextPlugin();
}
}));
};
/**
* Returns the value of the given property.
* If the property is a function, then the result of the function is returned.
* If the value is a RegExp, then it will be tested against the file URL.
 * If the value is an array, then it will be compared against the file extension.
*
* @param {object} obj - The object whose property/method is called
* @param {string} prop - The name of the property/method to invoke
* @param {object} file - A file info object, which will be passed to the method
* @param {function} [callback] - A callback function, which will be passed to the method
* @returns {*}
*/
function getResult (obj, prop, file, callback, $refs) {
let value = obj[prop];
if (typeof value === "function") {
return value.apply(obj, [file, callback, $refs]);
}
if (!callback) {
// The synchronous plugin functions (canParse and canRead)
// allow a "shorthand" syntax, where the user can match
// files by RegExp or by file extension.
if (value instanceof RegExp) {
return value.test(file.url);
}
else if (typeof value === "string") {
return value === file.extension;
}
else if (Array.isArray(value)) {
return value.indexOf(file.extension) !== -1;
}
}
return value;
}
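/**
 * Illustrative usage sketch (added for clarity; not part of the original sources).
 * The shorthand handling in getResult() above is what lets a plugin declare
 * `canParse`/`canRead` as a file extension, an array of extensions, or a RegExp
 * instead of a function. The "csv" parser below is a hypothetical example.
 */
async function dereferenceWithCsvParser (schema) {
  const $RefParser = require("@apidevtools/json-schema-ref-parser");
  return $RefParser.dereference(schema, {
    parse: {
      csv: {
        order: 400,
        canParse: ".csv",        // shorthand: compared against file.extension
        parse (file) {
          // file.data is the raw content produced by whichever resolver read the file
          return file.data.toString().trim().split("\n").map((line) => line.split(","));
        }
      }
    }
  });
}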
"use strict";
let isWindows = /^win/.test(process.platform),
forwardSlashPattern = /\//g,
protocolPattern = /^(\w{2,}):\/\//i,
url = module.exports,
jsonPointerSlash = /~1/g,
jsonPointerTilde = /~0/g;
// RegExp patterns to URL-encode special characters in local filesystem paths
let urlEncodePatterns = [
/\?/g, "%3F",
/\#/g, "%23",
];
// RegExp patterns to URL-decode special characters for local filesystem paths
let urlDecodePatterns = [
/\%23/g, "#",
/\%24/g, "$",
/\%26/g, "&",
/\%2C/g, ",",
/\%40/g, "@"
];
exports.parse = require("url").parse;
exports.resolve = require("url").resolve;
/**
* Returns the current working directory (in Node) or the current page URL (in browsers).
*
* @returns {string}
*/
exports.cwd = function cwd () {
if (process.browser) {
return location.href;
}
let path = process.cwd();
let lastChar = path.slice(-1);
if (lastChar === "/" || lastChar === "\\") {
return path;
}
else {
return path + "/";
}
};
/**
* Returns the protocol of the given URL, or `undefined` if it has no protocol.
*
* @param {string} path
* @returns {?string}
*/
exports.getProtocol = function getProtocol (path) {
let match = protocolPattern.exec(path);
if (match) {
return match[1].toLowerCase();
}
};
/**
* Returns the lowercased file extension of the given URL,
* or an empty string if it has no extension.
*
* @param {string} path
* @returns {string}
*/
exports.getExtension = function getExtension (path) {
let lastDot = path.lastIndexOf(".");
if (lastDot >= 0) {
return url.stripQuery(path.substr(lastDot).toLowerCase());
}
return "";
};
/**
* Removes the query, if any, from the given path.
*
* @param {string} path
* @returns {string}
*/
exports.stripQuery = function stripQuery (path) {
let queryIndex = path.indexOf("?");
if (queryIndex >= 0) {
path = path.substr(0, queryIndex);
}
return path;
};
/**
 * Returns the hash (URL fragment) of the given path.
* If there is no hash, then the root hash ("#") is returned.
*
* @param {string} path
* @returns {string}
*/
exports.getHash = function getHash (path) {
let hashIndex = path.indexOf("#");
if (hashIndex >= 0) {
return path.substr(hashIndex);
}
return "#";
};
/**
* Removes the hash (URL fragment), if any, from the given path.
*
* @param {string} path
* @returns {string}
*/
exports.stripHash = function stripHash (path) {
let hashIndex = path.indexOf("#");
if (hashIndex >= 0) {
path = path.substr(0, hashIndex);
}
return path;
};
/**
* Determines whether the given path is an HTTP(S) URL.
*
* @param {string} path
* @returns {boolean}
*/
exports.isHttp = function isHttp (path) {
let protocol = url.getProtocol(path);
if (protocol === "http" || protocol === "https") {
return true;
}
else if (protocol === undefined) {
// There is no protocol. If we're running in a browser, then assume it's HTTP.
return process.browser;
}
else {
// It's some other protocol, such as "ftp://", "mongodb://", etc.
return false;
}
};
/**
* Determines whether the given path is a filesystem path.
* This includes "file://" URLs.
*
* @param {string} path
* @returns {boolean}
*/
exports.isFileSystemPath = function isFileSystemPath (path) {
if (process.browser) {
// We're running in a browser, so assume that all paths are URLs.
// This way, even relative paths will be treated as URLs rather than as filesystem paths
return false;
}
let protocol = url.getProtocol(path);
return protocol === undefined || protocol === "file";
};
/**
* Converts a filesystem path to a properly-encoded URL.
*
* This is intended to handle situations where JSON Schema $Ref Parser is called
* with a filesystem path that contains characters which are not allowed in URLs.
*
* @example
* The following filesystem paths would be converted to the following URLs:
*
* <"!@#$%^&*+=?'>.json ==> %3C%22!@%23$%25%5E&*+=%3F\'%3E.json
* C:\\My Documents\\File (1).json ==> C:/My%20Documents/File%20(1).json
* file://Project #42/file.json ==> file://Project%20%2342/file.json
*
* @param {string} path
* @returns {string}
*/
exports.fromFileSystemPath = function fromFileSystemPath (path) {
// Step 1: On Windows, replace backslashes with forward slashes,
// rather than encoding them as "%5C"
if (isWindows) {
path = path.replace(/\\/g, "/");
}
// Step 2: `encodeURI` will take care of MOST characters
path = encodeURI(path);
// Step 3: Manually encode characters that are not encoded by `encodeURI`.
// This includes characters such as "#" and "?", which have special meaning in URLs,
// but are just normal characters in a filesystem path.
for (let i = 0; i < urlEncodePatterns.length; i += 2) {
path = path.replace(urlEncodePatterns[i], urlEncodePatterns[i + 1]);
}
return path;
};
/**
* Converts a URL to a local filesystem path.
*
* @param {string} path
* @param {boolean} [keepFileProtocol] - If true, then "file://" will NOT be stripped
* @returns {string}
*/
exports.toFileSystemPath = function toFileSystemPath (path, keepFileProtocol) {
// Step 1: `decodeURI` will decode characters such as Cyrillic characters, spaces, etc.
path = decodeURI(path);
// Step 2: Manually decode characters that are not decoded by `decodeURI`.
// This includes characters such as "#" and "?", which have special meaning in URLs,
// but are just normal characters in a filesystem path.
for (let i = 0; i < urlDecodePatterns.length; i += 2) {
path = path.replace(urlDecodePatterns[i], urlDecodePatterns[i + 1]);
}
// Step 3: If it's a "file://" URL, then format it consistently
// or convert it to a local filesystem path
let isFileUrl = path.substr(0, 7).toLowerCase() === "file://";
if (isFileUrl) {
// Strip off the protocol, and the initial "/", if there is one
path = path[7] === "/" ? path.substr(8) : path.substr(7);
// insert a colon (":") after the drive letter on Windows
if (isWindows && path[1] === "/") {
path = path[0] + ":" + path.substr(1);
}
if (keepFileProtocol) {
// Return the consistently-formatted "file://" URL
path = "file:///" + path;
}
else {
// Convert the "file://" URL to a local filesystem path.
// On Windows, it will start with something like "C:/".
// On Posix, it will start with "/"
isFileUrl = false;
path = isWindows ? path : "/" + path;
}
}
// Step 4: Normalize Windows paths (unless it's a "file://" URL)
if (isWindows && !isFileUrl) {
// Replace forward slashes with backslashes
path = path.replace(forwardSlashPattern, "\\");
// Capitalize the drive letter
if (path.substr(1, 2) === ":\\") {
path = path[0].toUpperCase() + path.substr(1);
}
}
return path;
};
/**
* Converts a $ref pointer to a valid JSON Path.
*
* @param {string} pointer
* @returns {Array<number | string>}
*/
exports.safePointerToPath = function safePointerToPath (pointer) {
if (pointer.length <= 1 || pointer[0] !== "#" || pointer[1] !== "/") {
return [];
}
return pointer
.slice(2)
.split("/")
.map((value) => {
return decodeURIComponent(value)
.replace(jsonPointerSlash, "/")
.replace(jsonPointerTilde, "~");
});
};
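/**
 * Illustrative sketch (added for clarity; not part of the original sources).
 * Small demonstrations of the helpers above, using this module's own exports.
 */
function demonstrateUrlHelpers () {
  // safePointerToPath() decodes the JSON Pointer escape sequences (~1 = "/", ~0 = "~")
  console.log(url.safePointerToPath("#/definitions/foo~1bar")); // ["definitions", "foo/bar"]
  // fromFileSystemPath() percent-encodes "#" and "?" so they survive inside a URL
  console.log(url.fromFileSystemPath("schemas/file #1.json"));  // "schemas/file%20%231.json"
  console.log(url.stripHash("schema.json#/definitions/pet"));   // "schema.json"
}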
{
"name": "@apidevtools/json-schema-ref-parser",
"version": "9.1.0",
"description": "Parse, Resolve, and Dereference JSON Schema $ref pointers",
"keywords": [
"json",
"schema",
"jsonschema",
"json-schema",
"json-pointer",
"$ref",
"dereference",
"resolve"
],
"author": {
"name": "James Messinger",
"url": "https://jamesmessinger.com"
},
"contributors": [
{
"name": "Boris Cherny",
"email": "boris@performancejs.com"
},
{
"name": "Jakub Rożek",
"email": "jakub@stoplight.io"
}
],
"homepage": "https://apitools.dev/json-schema-ref-parser/",
"repository": {
"type": "git",
"url": "https://github.com/APIDevTools/json-schema-ref-parser.git"
},
"license": "MIT",
"main": "lib/index.js",
"typings": "lib/index.d.ts",
"browser": {
"fs": false
},
"files": [
"lib"
],
"scripts": {
"build": "cp LICENSE *.md dist",
"clean": "shx rm -rf .nyc_output coverage",
"lint": "eslint lib test/fixtures test/specs",
"test": "npm run test:node && npm run test:typescript && npm run test:browser && npm run lint",
"test:node": "mocha",
"test:browser": "karma start --single-run",
"test:typescript": "tsc --noEmit --strict --lib esnext,dom test/specs/typescript-definition.spec.ts",
"coverage": "npm run coverage:node && npm run coverage:browser",
"coverage:node": "nyc node_modules/mocha/bin/mocha",
"coverage:browser": "npm run test:browser -- --coverage",
"upgrade": "npm-check -u && npm audit fix"
},
"devDependencies": {
"@amanda-mitchell/semantic-release-npm-multiple": "^2.5.0",
"@babel/polyfill": "^7.12.1",
"@jsdevtools/eslint-config": "^1.0.7",
"@jsdevtools/host-environment": "^2.1.2",
"@jsdevtools/karma-config": "^3.1.7",
"@types/node": "^14.14.21",
"chai-subset": "^1.6.0",
"chai": "^4.2.0",
"eslint": "^7.18.0",
"karma-cli": "^2.0.0",
"karma": "^5.0.2",
"mocha": "^8.2.1",
"npm-check": "^5.9.0",
"nyc": "^15.0.1",
"semantic-release-plugin-update-version-in-files": "^1.1.0",
"shx": "^0.3.2",
"typescript": "^4.0.5"
},
"dependencies": {
"@jsdevtools/ono": "^7.1.3",
"@types/json-schema": "^7.0.6",
"call-me-maybe": "^1.0.1",
"js-yaml": "^4.1.0"
},
"release": {
"branches": [
"main",
"v9"
],
"plugins": [
"@semantic-release/commit-analyzer",
"@semantic-release/release-notes-generator",
[
"semantic-release-plugin-update-version-in-files",
{
"files": [
"dist/package.json"
],
"placeholder": "X.X.X"
}
],
[
"@amanda-mitchell/semantic-release-npm-multiple",
{
"registries": {
"scoped": {
"pkgRoot": "."
},
"unscoped": {
"pkgRoot": "dist"
}
}
}
],
"@semantic-release/github"
]
}
}
Change Log
====================================================================================================
All notable changes will be documented in this file.
OpenAPI Schemas adheres to [Semantic Versioning](http://semver.org/).
[v2.0.0](https://github.com/APIDevTools/openapi-schemas/tree/v2.0.0) (2020-03-10)
----------------------------------------------------------------------------------------------------
- Moved OpenAPI Schemas to the [@APIDevTools scope](https://www.npmjs.com/org/apidevtools) on NPM
- The "openapi-schemas" NPM package is now just a wrapper around the scoped "@apidevtools/openapi-schemas" package
[Full Changelog](https://github.com/APIDevTools/openapi-schemas/compare/v1.0.3...v2.0.0)
[v1.0.0](https://github.com/APIDevTools/openapi-schemas/tree/v1.0.0) (2019-06-22)
----------------------------------------------------------------------------------------------------
Initial release 🎉
The MIT License (MIT)
Copyright (c) 2019 James Messinger
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in
all copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
THE SOFTWARE.
# OpenAPI Specification Schemas
[![Cross-Platform Compatibility](https://apitools.dev/img/badges/os-badges.svg)](https://github.com/APIDevTools/openapi-schemas/actions)
[![Build Status](https://github.com/APIDevTools/openapi-schemas/workflows/CI-CD/badge.svg?branch=master)](https://github.com/APIDevTools/openapi-schemas/actions)
[![Coverage Status](https://coveralls.io/repos/github/APIDevTools/openapi-schemas/badge.svg?branch=master)](https://coveralls.io/github/APIDevTools/openapi-schemas)
[![Dependencies](https://david-dm.org/APIDevTools/openapi-schemas.svg)](https://david-dm.org/APIDevTools/openapi-schemas)
[![npm](https://img.shields.io/npm/v/@apidevtools/openapi-schemas.svg)](https://www.npmjs.com/package/@apidevtools/openapi-schemas)
[![License](https://img.shields.io/npm/l/@apidevtools/openapi-schemas.svg)](LICENSE)
[![Buy us a tree](https://img.shields.io/badge/Treeware-%F0%9F%8C%B3-lightgreen)](https://plant.treeware.earth/APIDevTools/openapi-schemas)
This package contains [**the official JSON Schemas**](https://github.com/OAI/OpenAPI-Specification/tree/master/schemas) for every version of Swagger/OpenAPI Specification:
| Version | Schema | Docs
|---------|--------|-------
| Swagger 1.2 | [v1.2 schema](https://github.com/OAI/OpenAPI-Specification/tree/master/schemas/v1.2) | [v1.2 docs](https://github.com/OAI/OpenAPI-Specification/blob/master/versions/1.2.md)
| Swagger 2.0 | [v2.0 schema](https://github.com/OAI/OpenAPI-Specification/blob/master/schemas/v2.0/schema.json) | [v2.0 docs](https://github.com/OAI/OpenAPI-Specification/blob/master/versions/2.0.md)
| OpenAPI 3.0.x | [v3.0.x schema](https://github.com/OAI/OpenAPI-Specification/blob/master/schemas/v3.0/schema.json) | [v3.0.3 docs](https://github.com/OAI/OpenAPI-Specification/blob/master/versions/3.0.3.md)
| OpenAPI 3.1.x | [v3.1.x schema](https://github.com/OAI/OpenAPI-Specification/blob/master/schemas/v3.1/schema.json) | [v3.1.0 docs](https://github.com/OAI/OpenAPI-Specification/blob/master/versions/3.1.0.md)
All schemas are kept up-to-date with the latest official definitions via an automated CI/CD job. 🤖📦
Installation
--------------------------
You can install OpenAPI Schemas via [npm](https://docs.npmjs.com/about-npm/).
```bash
npm install @apidevtools/openapi-schemas
```
Usage
--------------------------
The default export contains all OpenAPI Specification versions:
```javascript
const openapi = require("@apidevtools/openapi-schemas");
console.log(openapi.v1); // { $schema, id, properties, definitions, ... }
console.log(openapi.v2); // { $schema, id, properties, definitions, ... }
console.log(openapi.v3); // { $schema, id, properties, definitions, ... }
console.log(openapi.v31); // { $schema, id, properties, definitions, ... }
```
Or you can import the specific version(s) that you need:
```javascript
const { openapiV1, openapiV2, openapiV3, openapiV31 } = require("@apidevtools/openapi-schemas");
console.log(openapiV1); // { $schema, id, properties, definitions, ... }
console.log(openapiV2); // { $schema, id, properties, definitions, ... }
console.log(openapiV3); // { $schema, id, properties, definitions, ... }
console.log(openapiV31); // { $schema, id, properties, definitions, ... }
```
You can use a JSON Schema validator such as [Z-Schema](https://www.npmjs.com/package/z-schema) or [AJV](https://www.npmjs.com/package/ajv) to validate OpenAPI definitions against the specification.
```javascript
const { openapiV31 } = require("@apidevtools/openapi-schemas");
const ZSchema = require("z-schema");
// Create a ZSchema validator
let validator = new ZSchema();
// Validate an OpenAPI definition against the OpenAPI v3.1 specification
validator.validate(openapiDefinition, openapiV31);
```
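In the snippet above, `openapiDefinition` stands for your already-loaded OpenAPI document (for example, the result of `JSON.parse()` on a JSON file, or of a YAML parser on a YAML file). With Z-Schema, `validate()` returns a boolean and `validator.getLastErrors()` lists any failures.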
Contributing
--------------------------
Contributions, enhancements, and bug-fixes are welcome! [Open an issue](https://github.com/APIDevTools/openapi-schemas/issues) on GitHub and [submit a pull request](https://github.com/APIDevTools/openapi-schemas/pulls).
#### Building
To build the project locally on your computer:
1. __Clone this repo__<br>
`git clone https://github.com/APIDevTools/openapi-schemas.git`
2. __Install dependencies__<br>
`npm install`
3. __Build the code__<br>
`npm run build`
4. __Run the tests__<br>
`npm test`
License
--------------------------
OpenAPI Schemas is 100% free and open-source, under the [MIT license](LICENSE). Use it however you want.
This package is [Treeware](http://treeware.earth). If you use it in production, then we ask that you [**buy the world a tree**](https://plant.treeware.earth/APIDevTools/openapi-schemas) to thank us for our work. By contributing to the Treeware forest you’ll be creating employment for local families and restoring wildlife habitats.
Big Thanks To
--------------------------
Thanks to these awesome companies for their support of Open Source developers ❤
[![GitHub](https://apitools.dev/img/badges/github.svg)](https://github.com/open-source)
[![NPM](https://apitools.dev/img/badges/npm.svg)](https://www.npmjs.com/)
[![Coveralls](https://apitools.dev/img/badges/coveralls.svg)](https://coveralls.io)
[![Travis CI](https://apitools.dev/img/badges/travis-ci.svg)](https://travis-ci.com)
[![SauceLabs](https://apitools.dev/img/badges/sauce-labs.svg)](https://saucelabs.com)
import { JsonSchemaDraft4, JsonSchemaDraft202012 } from "./json-schema";
export { JsonSchemaDraft4, JsonSchemaDraft202012 };
/**
* JSON Schema for OpenAPI Specification v1.2
*/
export declare const openapiV1: JsonSchemaDraft4;
/**
* JSON Schema for OpenAPI Specification v2.0
*/
export declare const openapiV2: JsonSchemaDraft4;
/**
* JSON Schema for OpenAPI Specification v3.0
*/
export declare const openapiV3: JsonSchemaDraft4;
/**
* JSON Schema for OpenAPI Specification v3.1
*/
export declare const openapiV31: JsonSchemaDraft202012;
/**
* JSON Schemas for every version of the OpenAPI Specification
*/
export declare const openapi: {
v1: JsonSchemaDraft4;
v2: JsonSchemaDraft4;
v3: JsonSchemaDraft4;
v31: JsonSchemaDraft202012;
};
export default openapi;
"use strict";
Object.defineProperty(exports, "__esModule", { value: true });
exports.openapi = exports.openapiV31 = exports.openapiV3 = exports.openapiV2 = exports.openapiV1 = void 0;
/**
* JSON Schema for OpenAPI Specification v1.2
*/
exports.openapiV1 = require("../schemas/v1.2/apiDeclaration.json");
/**
* JSON Schema for OpenAPI Specification v2.0
*/
exports.openapiV2 = require("../schemas/v2.0/schema.json");
/**
* JSON Schema for OpenAPI Specification v3.0
*/
exports.openapiV3 = require("../schemas/v3.0/schema.json");
/**
* JSON Schema for OpenAPI Specification v3.1
*/
exports.openapiV31 = require("../schemas/v3.1/schema.json");
/**
* JSON Schemas for every version of the OpenAPI Specification
*/
exports.openapi = {
v1: exports.openapiV1,
v2: exports.openapiV2,
v3: exports.openapiV3,
v31: exports.openapiV31,
};
// Export `openapi` as the default export
exports.default = exports.openapi;
// CommonJS default export hack
/* eslint-env commonjs */
if (typeof module === "object" && typeof module.exports === "object") {
module.exports = Object.assign(module.exports.default, module.exports);
}
//# sourceMappingURL=index.js.map
{"version":3,"file":"index.js","sourceRoot":"","sources":["../src/index.ts"],"names":[],"mappings":";;;AAIA;;GAEG;AACU,QAAA,SAAS,GAAG,OAAO,CAAC,qCAAqC,CAAqB,CAAC;AAE5F;;GAEG;AACU,QAAA,SAAS,GAAG,OAAO,CAAC,6BAA6B,CAAqB,CAAC;AAEpF;;GAEG;AACU,QAAA,SAAS,GAAG,OAAO,CAAC,6BAA6B,CAAqB,CAAC;AAEpF;;GAEG;AACU,QAAA,UAAU,GAAG,OAAO,CAAC,6BAA6B,CAA0B,CAAC;AAE1F;;GAEG;AACU,QAAA,OAAO,GAAG;IACrB,EAAE,EAAE,iBAAS;IACb,EAAE,EAAE,iBAAS;IACb,EAAE,EAAE,iBAAS;IACb,GAAG,EAAE,kBAAU;CAChB,CAAC;AAEF,yCAAyC;AACzC,kBAAe,eAAO,CAAC;AAEvB,+BAA+B;AAC/B,yBAAyB;AACzB,IAAI,OAAO,MAAM,KAAK,QAAQ,IAAI,OAAO,MAAM,CAAC,OAAO,KAAK,QAAQ,EAAE;IACpE,MAAM,CAAC,OAAO,GAAG,MAAM,CAAC,MAAM,CAAC,MAAM,CAAC,OAAO,CAAC,OAAO,EAAE,MAAM,CAAC,OAAO,CAAC,CAAC;CACxE"}
/**
* A JSON Schema 4.0 definition for an OpenAPI Specification
*/
export interface JsonSchemaDraft4 {
id?: string;
$schema?: string;
title?: string;
description?: string;
multipleOf?: number;
maximum?: number;
exclusiveMaximum?: boolean;
minimum?: number;
exclusiveMinimum?: boolean;
maxLength?: number;
minLength?: number;
pattern?: string;
additionalItems?: boolean | JsonSchemaDraft4;
items?: JsonSchemaDraft4 | JsonSchemaDraft4[];
maxItems?: number;
minItems?: number;
uniqueItems?: boolean;
maxProperties?: number;
minProperties?: number;
required?: string[];
additionalProperties?: boolean | JsonSchemaDraft4;
definitions?: {
[name: string]: JsonSchemaDraft4;
};
properties?: {
[name: string]: JsonSchemaDraft4;
};
patternProperties?: {
[name: string]: JsonSchemaDraft4;
};
dependencies?: {
[name: string]: JsonSchemaDraft4 | string[];
};
enum?: string[];
type?: string | string[];
allOf?: JsonSchemaDraft4[];
anyOf?: JsonSchemaDraft4[];
oneOf?: JsonSchemaDraft4[];
not?: JsonSchemaDraft4;
}
/**
* A JSON Schema 2020-12 definition for an OpenAPI Specification
*/
export interface JsonSchemaDraft202012 {
$id?: string;
$schema?: string;
title?: string;
description?: string;
multipleOf?: number;
maximum?: number;
exclusiveMaximum?: boolean;
minimum?: number;
exclusiveMinimum?: boolean;
maxLength?: number;
minLength?: number;
pattern?: string;
additionalItems?: boolean | JsonSchemaDraft202012;
items?: JsonSchemaDraft202012 | JsonSchemaDraft202012[];
maxItems?: number;
minItems?: number;
uniqueItems?: boolean;
maxProperties?: number;
minProperties?: number;
required?: string[];
additionalProperties?: boolean | JsonSchemaDraft202012;
$defs?: {
[name: string]: JsonSchemaDraft202012;
};
properties?: {
[name: string]: JsonSchemaDraft202012;
};
patternProperties?: {
[name: string]: JsonSchemaDraft202012;
};
dependencies?: {
[name: string]: JsonSchemaDraft202012 | string[];
};
enum?: string[];
type?: string | string[];
allOf?: JsonSchemaDraft202012[];
anyOf?: JsonSchemaDraft202012[];
oneOf?: JsonSchemaDraft202012[];
not?: JsonSchemaDraft202012;
}
"use strict";
Object.defineProperty(exports, "__esModule", { value: true });
//# sourceMappingURL=json-schema.js.map
{"version":3,"file":"json-schema.js","sourceRoot":"","sources":["../src/json-schema.ts"],"names":[],"mappings":""}
{
"name": "@apidevtools/openapi-schemas",
"version": "2.1.0",
"description": "JSON Schemas for every version of the OpenAPI Specification",
"keywords": [
"openapi",
"open-api",
"swagger",
"oas",
"api",
"rest",
"json",
"specification",
"definition",
"schema"
],
"author": {
"name": "James Messinger",
"url": "https://jamesmessinger.com"
},
"license": "MIT",
"homepage": "https://apitools.dev/openapi-schemas",
"repository": {
"type": "git",
"url": "https://github.com/APIDevTools/openapi-schemas.git"
},
"main": "lib/index.js",
"types": "lib/index.d.ts",
"files": [
"lib",
"schemas"
],
"scripts": {
"clean": "shx rm -rf .nyc_output coverage lib .tmp schemas",
"clone": "git clone https://github.com/OAI/OpenAPI-Specification.git .tmp",
"copy": "shx cp -r .tmp/schemas schemas",
"lint": "eslint src test",
"build": "npm run build:schemas && npm run build:typescript",
"build:schemas": "npm run clean && npm run clone && npm run copy",
"build:typescript": "tsc",
"watch": "tsc --watch",
"test": "mocha && npm run lint",
"coverage": "nyc node_modules/mocha/bin/mocha",
"upgrade": "npm-check -u && npm audit fix",
"bump": "bump --tag --push --all",
"release": "npm run upgrade && npm run clean && npm run build && npm test && npm run bump"
},
"engines": {
"node": ">=10"
},
"devDependencies": {
"@jsdevtools/eslint-config": "^1.1.4",
"@jsdevtools/version-bump-prompt": "^6.1.0",
"@types/chai": "^4.2.17",
"@types/command-line-args": "^5.0.0",
"@types/mocha": "^8.2.2",
"@types/node": "^15.0.1",
"chai": "^4.3.4",
"eslint": "^7.25.0",
"mocha": "^8.3.2",
"npm-check": "^5.9.2",
"nyc": "^15.1.0",
"shx": "^0.3.3",
"typescript": "^4.2.4"
}
}
# Swagger Specification JSON Schemas
The work on the JSON Schema for the Swagger Specification was donated to the community by [Francis Galiegue](https://github.com/fge)!
Keep in mind that due to some JSON Schema limitations, not all constraints can be described. The missing constraints will be listed here in the future.
{
"id": "https://raw.githubusercontent.com/OAI/OpenAPI-Specification/master/schemas/v1.2/apiDeclaration.json#",
"$schema": "http://json-schema.org/draft-04/schema#",
"type": "object",
"required": [ "swaggerVersion", "basePath", "apis" ],
"properties": {
"swaggerVersion": { "enum": [ "1.2" ] },
"apiVersion": { "type": "string" },
"basePath": {
"type": "string",
"format": "uri",
"pattern": "^https?://"
},
"resourcePath": {
"type": "string",
"format": "uri",
"pattern": "^/"
},
"apis": {
"type": "array",
"items": { "$ref": "#/definitions/apiObject" }
},
"models": {
"type": "object",
"additionalProperties": {
"$ref": "modelsObject.json#"
}
},
"produces": { "$ref": "#/definitions/mimeTypeArray" },
"consumes": { "$ref": "#/definitions/mimeTypeArray" },
"authorizations": { "$ref": "authorizationObject.json#" }
},
"additionalProperties": false,
"definitions": {
"apiObject": {
"type": "object",
"required": [ "path", "operations" ],
"properties": {
"path": {
"type": "string",
"format": "uri-template",
"pattern": "^/"
},
"description": { "type": "string" },
"operations": {
"type": "array",
"items": { "$ref": "operationObject.json#" }
}
},
"additionalProperties": false
},
"mimeTypeArray": {
"type": "array",
"items": {
"type": "string",
"format": "mime-type"
},
"uniqueItems": true
}
}
}
{
"id": "https://raw.githubusercontent.com/OAI/OpenAPI-Specification/master/schemas/v1.2/authorizationObject.json#",
"$schema": "http://json-schema.org/draft-04/schema#",
"type": "object",
"additionalProperties": {
"oneOf": [
{
"$ref": "#/definitions/basicAuth"
},
{
"$ref": "#/definitions/apiKey"
},
{
"$ref": "#/definitions/oauth2"
}
]
},
"definitions": {
"basicAuth": {
"required": [ "type" ],
"properties": {
"type": { "enum": [ "basicAuth" ] }
},
"additionalProperties": false
},
"apiKey": {
"required": [ "type", "passAs", "keyname" ],
"properties": {
"type": { "enum": [ "apiKey" ] },
"passAs": { "enum": [ "header", "query" ] },
"keyname": { "type": "string" }
},
"additionalProperties": false
},
"oauth2": {
"type": "object",
"required": [ "type", "grantTypes" ],
"properties": {
"type": { "enum": [ "oauth2" ] },
"scopes": {
"type": "array",
"items": { "$ref": "#/definitions/oauth2Scope" }
},
"grantTypes": { "$ref": "oauth2GrantType.json#" }
},
"additionalProperties": false
},
"oauth2Scope": {
"type": "object",
"required": [ "scope" ],
"properties": {
"scope": { "type": "string" },
"description": { "type": "string" }
},
"additionalProperties": false
}
}
}
{
"id": "https://raw.githubusercontent.com/OAI/OpenAPI-Specification/master/schemas/v1.2/dataType.json#",
"$schema": "http://json-schema.org/draft-04/schema#",
"description": "Data type as described by the specification (version 1.2)",
"type": "object",
"oneOf": [
{ "$ref": "#/definitions/refType" },
{ "$ref": "#/definitions/voidType" },
{ "$ref": "#/definitions/primitiveType" },
{ "$ref": "#/definitions/modelType" },
{ "$ref": "#/definitions/arrayType" }
],
"definitions": {
"refType": {
"required": [ "$ref" ],
"properties": {
"$ref": { "type": "string" }
},
"additionalProperties": false
},
"voidType": {
"enum": [ { "type": "void" } ]
},
"modelType": {
"required": [ "type" ],
"properties": {
"type": {
"type": "string",
"not": {
"enum": [ "boolean", "integer", "number", "string", "array" ]
}
}
},
"additionalProperties": false
},
"primitiveType": {
"required": [ "type" ],
"properties": {
"type": {
"enum": [ "boolean", "integer", "number", "string" ]
},
"format": { "type": "string" },
"defaultValue": {
"not": { "type": [ "array", "object", "null" ] }
},
"enum": {
"type": "array",
"items": { "type": "string" },
"minItems": 1,
"uniqueItems": true
},
"minimum": { "type": "string" },
"maximum": { "type": "string" }
},
"additionalProperties": false,
"dependencies": {
"format": {
"oneOf": [
{
"properties": {
"type": { "enum": [ "integer" ] },
"format": { "enum": [ "int32", "int64" ] }
}
},
{
"properties": {
"type": { "enum": [ "number" ] },
"format": { "enum": [ "float", "double" ] }
}
},
{
"properties": {
"type": { "enum": [ "string" ] },
"format": {
"enum": [ "byte", "date", "date-time" ]
}
}
}
]
},
"enum": {
"properties": {
"type": { "enum": [ "string" ] }
}
},
"minimum": {
"properties": {
"type": { "enum": [ "integer", "number" ] }
}
},
"maximum": {
"properties": {
"type": { "enum": [ "integer", "number" ] }
}
}
}
},
"arrayType": {
"required": [ "type", "items" ],
"properties": {
"type": { "enum": [ "array" ] },
"items": {
"type": "array",
"items": { "$ref": "#/definitions/itemsObject" }
},
"uniqueItems": { "type": "boolean" }
},
"additionalProperties": false
},
"itemsObject": {
"oneOf": [
{
"$ref": "#/definitions/refType"
},
{
"allOf": [
{
"$ref": "#/definitions/primitiveType"
},
{
"properties": {
"type": {},
"format": {}
},
"additionalProperties": false
}
]
}
]
}
}
}
{
"id": "https://raw.githubusercontent.com/OAI/OpenAPI-Specification/master/schemas/v1.2/dataTypeBase.json#",
"$schema": "http://json-schema.org/draft-04/schema#",
"description": "Data type fields (section 4.3.3)",
"type": "object",
"oneOf": [
{ "required": [ "type" ] },
{ "required": [ "$ref" ] }
],
"properties": {
"type": { "type": "string" },
"$ref": { "type": "string" },
"format": { "type": "string" },
"defaultValue": {
"not": { "type": [ "array", "object", "null" ] }
},
"enum": {
"type": "array",
"items": { "type": "string" },
"uniqueItems": true,
"minItems": 1
},
"minimum": { "type": "string" },
"maximum": { "type": "string" },
"items": { "$ref": "#/definitions/itemsObject" },
"uniqueItems": { "type": "boolean" }
},
"dependencies": {
"format": {
"oneOf": [
{
"properties": {
"type": { "enum": [ "integer" ] },
"format": { "enum": [ "int32", "int64" ] }
}
},
{
"properties": {
"type": { "enum": [ "number" ] },
"format": { "enum": [ "float", "double" ] }
}
},
{
"properties": {
"type": { "enum": [ "string" ] },
"format": {
"enum": [ "byte", "date", "date-time" ]
}
}
}
]
}
},
"definitions": {
"itemsObject": {
"oneOf": [
{
"type": "object",
"required": [ "$ref" ],
"properties": {
"$ref": { "type": "string" }
},
"additionalProperties": false
},
{
"allOf": [
{ "$ref": "#" },
{
"required": [ "type" ],
"properties": {
"type": {},
"format": {}
},
"additionalProperties": false
}
]
}
]
}
}
}
{
"id": "https://raw.githubusercontent.com/OAI/OpenAPI-Specification/master/schemas/v1.2/infoObject.json#",
"$schema": "http://json-schema.org/draft-04/schema#",
"description": "info object (section 5.1.3)",
"type": "object",
"required": [ "title", "description" ],
"properties": {
"title": { "type": "string" },
"description": { "type": "string" },
"termsOfServiceUrl": { "type": "string", "format": "uri" },
"contact": { "type": "string", "format": "email" },
"license": { "type": "string" },
"licenseUrl": { "type": "string", "format": "uri" }
},
"additionalProperties": false
}
{
"id": "https://raw.githubusercontent.com/OAI/OpenAPI-Specification/master/schemas/v1.2/modelsObject.json#",
"$schema": "http://json-schema.org/draft-04/schema#",
"type": "object",
"required": [ "id", "properties" ],
"properties": {
"id": { "type": "string" },
"description": { "type": "string" },
"properties": {
"type": "object",
"additionalProperties": { "$ref": "#/definitions/propertyObject" }
},
"subTypes": {
"type": "array",
"items": { "type": "string" },
"uniqueItems": true
},
"discriminator": { "type": "string" }
},
"dependencies": {
"subTypes": [ "discriminator" ]
},
"definitions": {
"propertyObject": {
"allOf": [
{
"not": { "$ref": "#" }
},
{
"$ref": "dataTypeBase.json#"
}
]
}
}
}
{
"id": "https://raw.githubusercontent.com/OAI/OpenAPI-Specification/master/schemas/v1.2/oauth2GrantType.json#",
"$schema": "http://json-schema.org/draft-04/schema#",
"type": "object",
"minProperties": 1,
"properties": {
"implicit": { "$ref": "#/definitions/implicit" },
"authorization_code": { "$ref": "#/definitions/authorizationCode" }
},
"definitions": {
"implicit": {
"type": "object",
"required": [ "loginEndpoint" ],
"properties": {
"loginEndpoint": { "$ref": "#/definitions/loginEndpoint" },
"tokenName": { "type": "string" }
},
"additionalProperties": false
},
"authorizationCode": {
"type": "object",
"required": [ "tokenEndpoint", "tokenRequestEndpoint" ],
"properties": {
"tokenEndpoint": { "$ref": "#/definitions/tokenEndpoint" },
"tokenRequestEndpoint": { "$ref": "#/definitions/tokenRequestEndpoint" }
},
"additionalProperties": false
},
"loginEndpoint": {
"type": "object",
"required": [ "url" ],
"properties": {
"url": { "type": "string", "format": "uri" }
},
"additionalProperties": false
},
"tokenEndpoint": {
"type": "object",
"required": [ "url" ],
"properties": {
"url": { "type": "string", "format": "uri" },
"tokenName": { "type": "string" }
},
"additionalProperties": false
},
"tokenRequestEndpoint": {
"type": "object",
"required": [ "url" ],
"properties": {
"url": { "type": "string", "format": "uri" },
"clientIdName": { "type": "string" },
"clientSecretName": { "type": "string" }
},
"additionalProperties": false
}
}
}
{
"id": "https://raw.githubusercontent.com/OAI/OpenAPI-Specification/master/schemas/v1.2/operationObject.json#",
"$schema": "http://json-schema.org/draft-04/schema#",
"type": "object",
"allOf": [
{ "$ref": "dataTypeBase.json#" },
{
"required": [ "method", "nickname", "parameters" ],
"properties": {
"method": { "enum": [ "GET", "HEAD", "POST", "PUT", "PATCH", "DELETE", "OPTIONS" ] },
"summary": { "type": "string", "maxLength": 120 },
"notes": { "type": "string" },
"nickname": {
"type": "string",
"pattern": "^[a-zA-Z0-9_]+$"
},
"authorizations": {
"type": "object",
"additionalProperties": {
"type": "array",
"items": {
"$ref": "authorizationObject.json#/definitions/oauth2Scope"
}
}
},
"parameters": {
"type": "array",
"items": { "$ref": "parameterObject.json#" }
},
"responseMessages": {
"type": "array",
"items": { "$ref": "#/definitions/responseMessageObject"}
},
"produces": { "$ref": "#/definitions/mimeTypeArray" },
"consumes": { "$ref": "#/definitions/mimeTypeArray" },
"deprecated": { "enum": [ "true", "false" ] }
}
}
],
"definitions": {
"responseMessageObject": {
"type": "object",
"required": [ "code", "message" ],
"properties": {
"code": { "$ref": "#/definitions/rfc2616section10" },
"message": { "type": "string" },
"responseModel": { "type": "string" }
}
},
"rfc2616section10": {
"type": "integer",
"minimum": 100,
"maximum": 600,
"exclusiveMaximum": true
},
"mimeTypeArray": {
"type": "array",
"items": {
"type": "string",
"format": "mime-type"
},
"uniqueItems": true
}
}
}
{
"id": "https://raw.githubusercontent.com/OAI/OpenAPI-Specification/master/schemas/v1.2/parameterObject.json#",
"$schema": "http://json-schema.org/draft-04/schema#",
"type": "object",
"allOf": [
{ "$ref": "dataTypeBase.json#" },
{
"required": [ "paramType", "name" ],
"properties": {
"paramType": {
"enum": [ "path", "query", "body", "header", "form" ]
},
"name": { "type": "string" },
"description": { "type": "string" },
"required": { "type": "boolean" },
"allowMultiple": { "type": "boolean" }
}
},
{
"description": "type File requires special paramType and consumes",
"oneOf": [
{
"properties": {
"type": { "not": { "enum": [ "File" ] } }
}
},
{
"properties": {
"type": { "enum": [ "File" ] },
"paramType": { "enum": [ "form" ] },
"consumes": { "enum": [ "multipart/form-data" ] }
}
}
]
}
]
}
{
"id": "https://raw.githubusercontent.com/OAI/OpenAPI-Specification/master/schemas/v1.2/resourceListing.json#",
"$schema": "http://json-schema.org/draft-04/schema#",
"type": "object",
"required": [ "swaggerVersion", "apis" ],
"properties": {
"swaggerVersion": { "enum": [ "1.2" ] },
"apis": {
"type": "array",
"items": { "$ref": "resourceObject.json#" }
},
"apiVersion": { "type": "string" },
"info": { "$ref": "infoObject.json#" },
"authorizations": { "$ref": "authorizationObject.json#" }
}
}
{
"id": "https://raw.githubusercontent.com/OAI/OpenAPI-Specification/master/schemas/v1.2/resourceObject.json#",
"$schema": "http://json-schema.org/draft-04/schema#",
"type": "object",
"required": [ "path" ],
"properties": {
"path": { "type": "string", "format": "uri" },
"description": { "type": "string" }
},
"additionalProperties": false
}
# OpenAPI Specification v2.0 JSON Schema
This is the JSON Schema file for the OpenAPI Specification version 2.0. Download and install it via NPM.
## Install via NPM
```shell
npm install --save swagger-schema-official
```
## License
Apache-2.0
OpenAPI 3.0.X JSON Schema
---
Here you can find the JSON Schema for validating OpenAPI definitions of versions 3.0.X.
As a reminder, the JSON Schema is not the source of truth for the Specification. In cases of conflicts between the Specification itself and the JSON Schema, the Specification wins. Also, some Specification constraints cannot be represented with the JSON Schema so it's highly recommended to employ other methods to ensure compliance.
The iteration version of the JSON Schema can be found in the `id` field. For example, the value of `id: https://spec.openapis.org/oas/3.0/schema/2019-04-02` means this iteration was created on April 2nd, 2019.
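For example, a quick way to check which iteration you have installed (this sketch assumes the schema is consumed through the `@apidevtools/openapi-schemas` wrapper described earlier in this document):
```javascript
const { openapiV3 } = require("@apidevtools/openapi-schemas");
// The draft-04 schema carries its iteration date in its `id` field
console.log(openapiV3.id); // e.g. "https://spec.openapis.org/oas/3.0/schema/2019-04-02"
```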
To submit improvements to the schema, modify the schema.yaml file only.
The TSC will then:
- Run tests on the updated schema
- Update the iteration version
- Convert the schema.yaml to schema.json
- Publish the new version
# OpenAPI 3.1.X JSON Schema
Here you can find the JSON Schema for validating OpenAPI definitions of versions
3.1.X.
As a reminder, the JSON Schema is not the source of truth for the Specification.
In cases of conflicts between the Specification itself and the JSON Schema, the
Specification wins. Also, some Specification constraints cannot be represented
with the JSON Schema so it's highly recommended to employ other methods to
ensure compliance.
The iteration version of the JSON Schema can be found in the `$id` field. For
example, the value of `$id: https://spec.openapis.org/oas/3.1/schema/2021-03-02`
means this iteration was created on March 2nd, 2021.
The `schema.yaml` schema doesn't validate the JSON Schemas in your OpenAPI
document because 3.1 allows you to use any JSON Schema dialect you choose. We
have also included `schema-base.yaml` that extends the main schema to validate
that all schemas use the default OAS base vocabulary.
## Contributing
To submit improvements to the schema, modify the schema.yaml file only.
The TSC will then:
- Run tests on the updated schema
- Update the iteration version
- Convert the schema.yaml to schema.json
- Publish the new version
## Tests
The test suite is included as a git submodule of https://github.com/Mermade/openapi3-examples.
```bash
npx mocha --recursive tests
```
You can also validate a document individually.
```bash
scripts/validate.js path/to/document/to/validate.yaml
```
{
"$id": "https://spec.openapis.org/oas/3.1/dialect/base",
"$schema": "https://json-schema.org/draft/2020-12/schema",
"$vocabulary": {
"https://json-schema.org/draft/2020-12/vocab/core": true,
"https://json-schema.org/draft/2020-12/vocab/applicator": true,
"https://json-schema.org/draft/2020-12/vocab/unevaluated": true,
"https://json-schema.org/draft/2020-12/vocab/validation": true,
"https://json-schema.org/draft/2020-12/vocab/meta-data": true,
"https://json-schema.org/draft/2020-12/vocab/format-annotation": true,
"https://json-schema.org/draft/2020-12/vocab/content": true,
"https://spec.openapis.org/oas/3.1/vocab/base": false
},
"$dynamicAnchor": "meta",
"title": "OpenAPI 3.1 Schema Object Dialect",
"allOf": [
{ "$ref": "https://json-schema.org/draft/2020-12/schema" },
{ "$ref": "https://spec.openapis.org/oas/3.1/meta/base" }
]
}
{
"$id": "https://spec.openapis.org/oas/3.1/schema-base/2021-04-15",
"$schema": "https://json-schema.org/draft/2020-12/schema",
"$ref": "https://spec.openapis.org/oas/3.1/schema/2021-04-15",
"properties": {
"jsonSchemaDialect": {
"$ref": "#/$defs/dialect"
}
},
"$defs": {
"dialect": {
"const": "https://spec.openapis.org/oas/3.1/dialect/base"
},
"schema": {
"$dynamicAnchor": "meta",
"$ref\"": "https://spec.openapis.org/oas/3.1/dialect/base",
"properties": {
"$schema": {
"$ref": "#/$defs/dialect"
}
}
}
}
}
$id: 'https://spec.openapis.org/oas/3.1/schema-base/2021-04-15'
$schema: 'https://json-schema.org/draft/2020-12/schema'
$ref: 'https://spec.openapis.org/oas/3.1/schema/2021-04-15'
properties:
jsonSchemaDialect:
$ref: '#/$defs/dialect'
$defs:
dialect:
const: 'https://spec.openapis.org/oas/3.1/dialect/base'
schema:
$dynamicAnchor: meta
$ref": 'https://spec.openapis.org/oas/3.1/dialect/base'
properties:
$schema:
$ref: '#/$defs/dialect'
The MIT License (MIT)
Copyright (c) 2015 James Messinger
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in
all copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
THE SOFTWARE.