Squashed 'json/' changes from ddef9b0..abdd47f

abdd47f Add a test for large int comparisons.

git-subtree-dir: json
git-subtree-split: abdd47f9dff99a116c8ce09505f2e8f61433dbe8
Julian Berman 2014-06-01 22:37:31 -04:00
parent 161f3078ba
commit ed332a1819
100 changed files with 25 additions and 5676 deletions

.gitignore

@@ -1,26 +1 @@
.DS_Store
.idea
*.pyc
*.pyo
*.egg-info
_build
build
dist
MANIFEST
.coverage
.coveragerc
coverage
htmlcov
_cache
_static
_templates
_trial_temp
.tox
TODO


@@ -1,12 +1,4 @@
language: python
python:
- "pypy"
- "2.6"
- "2.7"
- "3.3"
install:
- python setup.py -q install
script:
- if [[ "$(python -c 'import sys; print sys.version_info[:2]')" == "(2, 6)" ]]; then pip install unittest2; fi
- py.test --tb=native jsonschema
- python -m doctest README.rst
python: "2.7"
install: pip install jsonschema
script: bin/jsonschema_suite check


@@ -1,135 +0,0 @@
v2.3.0
------
* Added by_relevance and best_match (#91)
* Fixed ``format`` to allow adding formats for non-strings (#125)
* Fixed the ``uri`` format to reject URI references (#131)
v2.2.0
------
* Compile the host name regex (#127)
* Allow arbitrary objects to be types (#129)
v2.1.0
------
* Support RFC 3339 datetimes in conformance with the spec
* Fixed error paths for additionalItems + items (#122)
* Fixed wording for min / maxProperties (#117)
v2.0.0
------
* Added ``create`` and ``extend`` to ``jsonschema.validators``
* Removed ``ValidatorMixin``
* Fixed array indices ref resolution (#95)
* Fixed unknown scheme defragmenting and handling (#102)
v1.3.0
------
* Better error tracebacks (#83)
* Raise exceptions in ``ErrorTree``\s for keys not in the instance (#92)
* __cause__ (#93)
v1.2.0
------
* More attributes for ValidationError (#86)
* Added ``ValidatorMixin.descend``
* Fixed bad ``RefResolutionError`` message (#82)
v1.1.0
------
* Canonicalize URIs (#70)
* Allow attaching exceptions to ``format`` errors (#77)
v1.0.0
------
* Support for Draft 4
* Support for format
* Longs are ints too!
* Fixed a number of issues with ``$ref`` support (#66)
* Draft4Validator is now the default
* ``ValidationError.path`` is now in sequential order
* Added ``ValidatorMixin``
v0.8.0
------
* Full support for JSON References
* ``validates`` for registering new validators
* Documentation
* Bugfixes
* uniqueItems not so unique (#34)
* Improper any (#47)
v0.7
----
* Partial support for (JSON Pointer) ``$ref``
* Deprecations
* ``Validator`` is replaced by ``Draft3Validator`` with a slightly different
interface
* ``validator(meta_validate=False)``
v0.6
----
* Bugfixes
* Issue #30 - Wrong behavior for the dependencies property validation
* Fix a miswritten test
v0.5
----
* Bugfixes
* Issue #17 - require path for error objects
* Issue #18 - multiple type validation for non-objects
v0.4
----
* Preliminary support for programmatic access to error details (Issue #5).
There are certainly some corner cases that don't do the right thing yet, but
this works mostly.
In order to make this happen (and also to clean things up a bit), a number
of deprecations are necessary:
* ``stop_on_error`` is deprecated in ``Validator.__init__``. Use
``Validator.iter_errors()`` instead.
* ``number_types`` and ``string_types`` are deprecated there as well.
Use ``types={"number" : ..., "string" : ...}`` instead.
* ``meta_validate`` is also deprecated, and instead is now accepted as
an argument to ``validate``, ``iter_errors`` and ``is_valid``.
* A bugfix or two
v0.3
----
* Default for unknown types and properties is now to *not* error (consistent
with the schema).
* Python 3 support
* Removed dependency on SecureTypes now that the hash bug has been resolved.
* "Numerous bug fixes" -- most notably, a divisibleBy error for floats and a
bunch of missing typechecks for irrelevant properties.

COPYING

@@ -1,19 +0,0 @@
Copyright (c) 2013 Julian Berman
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in
all copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
THE SOFTWARE.


@@ -1,4 +0,0 @@
include *.rst
include COPYING
include tox.ini
recursive-include json *


@@ -1,123 +0,0 @@
==========
jsonschema
==========
``jsonschema`` is an implementation of `JSON Schema <http://json-schema.org>`_
for Python (supporting 2.6+ including Python 3).
.. code-block:: python
>>> from jsonschema import validate
>>> # A sample schema, like what we'd get from json.load()
>>> schema = {
... "type" : "object",
... "properties" : {
... "price" : {"type" : "number"},
... "name" : {"type" : "string"},
... },
... }
>>> # If no exception is raised by validate(), the instance is valid.
>>> validate({"name" : "Eggs", "price" : 34.99}, schema)
>>> validate(
... {"name" : "Eggs", "price" : "Invalid"}, schema
... ) # doctest: +IGNORE_EXCEPTION_DETAIL
Traceback (most recent call last):
...
ValidationError: 'Invalid' is not of type 'number'
Features
--------
* Full support for
`Draft 3 <https://python-jsonschema.readthedocs.org/en/latest/validate/#jsonschema.Draft3Validator>`_
**and** `Draft 4 <https://python-jsonschema.readthedocs.org/en/latest/validate/#jsonschema.Draft4Validator>`_
of the schema.
* `Lazy validation <https://python-jsonschema.readthedocs.org/en/latest/validate/#jsonschema.IValidator.iter_errors>`_
that can iteratively report *all* validation errors.
* Small and extensible
* `Programmatic querying <https://python-jsonschema.readthedocs.org/en/latest/errors/#module-jsonschema>`_
of which properties or items failed validation.
Release Notes
-------------
``v2.3.0`` removes the (improper) limitation of ``format`` to strings. It also
adds the `jsonschema.exceptions.best_match <https://python-jsonschema.readthedocs.org/en/latest/errors/#best-match-and-by-relevance>`_
function which can be used to guess at the best matching single validation
error for a given instance.
.. code-block:: python
>>> from jsonschema.validators import Draft4Validator
>>> from jsonschema.exceptions import best_match
>>> schema = {
... "properties" : {
... "foo" : {"type" : "string"},
... "bar" : {"properties" : {"baz": {"type": "string"}}},
... },
... }
>>> instance = {"foo" : 12, "bar": {"baz" : 19}}
>>> print(best_match(Draft4Validator(schema).iter_errors(instance)).path)
deque(['foo'])
where the error closer to the top of the instance in ``foo`` was selected
as being more relevant.
Also, URI references are now properly rejected by the URI format validator
(i.e., it now only accepts full URIs, as defined in the specification).
Running the Test Suite
----------------------
``jsonschema`` uses the wonderful `Tox <http://tox.readthedocs.org>`_ for its
test suite. (It really is wonderful; if for some reason you haven't heard of
it, you really should use it for your projects).
Assuming you have ``tox`` installed (perhaps via ``pip install tox`` or your
package manager), just run ``tox`` in the directory of your source checkout to
run ``jsonschema``'s test suite on all of the versions of Python ``jsonschema``
supports. Note that you'll need to have all of those versions installed in
order to run the tests on each of them; otherwise ``tox`` will skip (and fail)
the tests on that version.
Of course you're also free to just run the tests on a single version with your
favorite test runner. The tests live in the ``jsonschema.tests`` package.
Community
---------
There's a `mailing list <https://groups.google.com/forum/#!forum/jsonschema>`_
for this implementation on Google Groups.
Please join, and feel free to send questions there.
Contributing
------------
I'm Julian Berman.
``jsonschema`` is on `GitHub <http://github.com/Julian/jsonschema>`_.
Get in touch, via GitHub or otherwise, if you've got something to contribute,
it'd be most welcome!
You can also generally find me on Freenode (nick: ``tos9``) in various
channels, including ``#python``.
If you feel overwhelmingly grateful, you can woo me with beer money on
`Gittip <https://www.gittip.com/Julian/>`_ or via Google Wallet with the email
in my GitHub profile.


@@ -1,153 +0,0 @@
# Makefile for Sphinx documentation
#
# You can set these variables from the command line.
SPHINXOPTS =
SPHINXBUILD = sphinx-build
PAPER =
BUILDDIR = _build
# Internal variables.
PAPEROPT_a4 = -D latex_paper_size=a4
PAPEROPT_letter = -D latex_paper_size=letter
ALLSPHINXOPTS = -d $(BUILDDIR)/doctrees $(PAPEROPT_$(PAPER)) $(SPHINXOPTS) .
# the i18n builder cannot share the environment and doctrees with the others
I18NSPHINXOPTS = $(PAPEROPT_$(PAPER)) $(SPHINXOPTS) .
.PHONY: help clean html dirhtml singlehtml pickle json htmlhelp qthelp devhelp epub latex latexpdf text man changes linkcheck doctest gettext
help:
@echo "Please use \`make <target>' where <target> is one of"
@echo " html to make standalone HTML files"
@echo " dirhtml to make HTML files named index.html in directories"
@echo " singlehtml to make a single large HTML file"
@echo " pickle to make pickle files"
@echo " json to make JSON files"
@echo " htmlhelp to make HTML files and a HTML help project"
@echo " qthelp to make HTML files and a qthelp project"
@echo " devhelp to make HTML files and a Devhelp project"
@echo " epub to make an epub"
@echo " latex to make LaTeX files, you can set PAPER=a4 or PAPER=letter"
@echo " latexpdf to make LaTeX files and run them through pdflatex"
@echo " text to make text files"
@echo " man to make manual pages"
@echo " texinfo to make Texinfo files"
@echo " info to make Texinfo files and run them through makeinfo"
@echo " gettext to make PO message catalogs"
@echo " changes to make an overview of all changed/added/deprecated items"
@echo " linkcheck to check all external links for integrity"
@echo " doctest to run all doctests embedded in the documentation (if enabled)"
clean:
-rm -rf $(BUILDDIR)/*
html:
$(SPHINXBUILD) -b html $(ALLSPHINXOPTS) $(BUILDDIR)/html
@echo
@echo "Build finished. The HTML pages are in $(BUILDDIR)/html."
dirhtml:
$(SPHINXBUILD) -b dirhtml $(ALLSPHINXOPTS) $(BUILDDIR)/dirhtml
@echo
@echo "Build finished. The HTML pages are in $(BUILDDIR)/dirhtml."
singlehtml:
$(SPHINXBUILD) -b singlehtml $(ALLSPHINXOPTS) $(BUILDDIR)/singlehtml
@echo
@echo "Build finished. The HTML page is in $(BUILDDIR)/singlehtml."
pickle:
$(SPHINXBUILD) -b pickle $(ALLSPHINXOPTS) $(BUILDDIR)/pickle
@echo
@echo "Build finished; now you can process the pickle files."
json:
$(SPHINXBUILD) -b json $(ALLSPHINXOPTS) $(BUILDDIR)/json
@echo
@echo "Build finished; now you can process the JSON files."
htmlhelp:
$(SPHINXBUILD) -b htmlhelp $(ALLSPHINXOPTS) $(BUILDDIR)/htmlhelp
@echo
@echo "Build finished; now you can run HTML Help Workshop with the" \
".hhp project file in $(BUILDDIR)/htmlhelp."
qthelp:
$(SPHINXBUILD) -b qthelp $(ALLSPHINXOPTS) $(BUILDDIR)/qthelp
@echo
@echo "Build finished; now you can run "qcollectiongenerator" with the" \
".qhcp project file in $(BUILDDIR)/qthelp, like this:"
@echo "# qcollectiongenerator $(BUILDDIR)/qthelp/jsonschema.qhcp"
@echo "To view the help file:"
@echo "# assistant -collectionFile $(BUILDDIR)/qthelp/jsonschema.qhc"
devhelp:
$(SPHINXBUILD) -b devhelp $(ALLSPHINXOPTS) $(BUILDDIR)/devhelp
@echo
@echo "Build finished."
@echo "To view the help file:"
@echo "# mkdir -p $$HOME/.local/share/devhelp/jsonschema"
@echo "# ln -s $(BUILDDIR)/devhelp $$HOME/.local/share/devhelp/jsonschema"
@echo "# devhelp"
epub:
$(SPHINXBUILD) -b epub $(ALLSPHINXOPTS) $(BUILDDIR)/epub
@echo
@echo "Build finished. The epub file is in $(BUILDDIR)/epub."
latex:
$(SPHINXBUILD) -b latex $(ALLSPHINXOPTS) $(BUILDDIR)/latex
@echo
@echo "Build finished; the LaTeX files are in $(BUILDDIR)/latex."
@echo "Run \`make' in that directory to run these through (pdf)latex" \
"(use \`make latexpdf' here to do that automatically)."
latexpdf:
$(SPHINXBUILD) -b latex $(ALLSPHINXOPTS) $(BUILDDIR)/latex
@echo "Running LaTeX files through pdflatex..."
$(MAKE) -C $(BUILDDIR)/latex all-pdf
@echo "pdflatex finished; the PDF files are in $(BUILDDIR)/latex."
text:
$(SPHINXBUILD) -b text $(ALLSPHINXOPTS) $(BUILDDIR)/text
@echo
@echo "Build finished. The text files are in $(BUILDDIR)/text."
man:
$(SPHINXBUILD) -b man $(ALLSPHINXOPTS) $(BUILDDIR)/man
@echo
@echo "Build finished. The manual pages are in $(BUILDDIR)/man."
texinfo:
$(SPHINXBUILD) -b texinfo $(ALLSPHINXOPTS) $(BUILDDIR)/texinfo
@echo
@echo "Build finished. The Texinfo files are in $(BUILDDIR)/texinfo."
@echo "Run \`make' in that directory to run these through makeinfo" \
"(use \`make info' here to do that automatically)."
info:
$(SPHINXBUILD) -b texinfo $(ALLSPHINXOPTS) $(BUILDDIR)/texinfo
@echo "Running Texinfo files through makeinfo..."
make -C $(BUILDDIR)/texinfo info
@echo "makeinfo finished; the Info files are in $(BUILDDIR)/texinfo."
gettext:
$(SPHINXBUILD) -b gettext $(I18NSPHINXOPTS) $(BUILDDIR)/locale
@echo
@echo "Build finished. The message catalogs are in $(BUILDDIR)/locale."
changes:
$(SPHINXBUILD) -b changes $(ALLSPHINXOPTS) $(BUILDDIR)/changes
@echo
@echo "The overview file is in $(BUILDDIR)/changes."
linkcheck:
$(SPHINXBUILD) -b linkcheck $(ALLSPHINXOPTS) $(BUILDDIR)/linkcheck
@echo
@echo "Link check complete; look for any errors in the above output " \
"or in $(BUILDDIR)/linkcheck/output.txt."
doctest:
$(SPHINXBUILD) -b doctest $(ALLSPHINXOPTS) $(BUILDDIR)/doctest
@echo "Testing of doctests in the sources finished, look at the " \
"results in $(BUILDDIR)/doctest/output.txt."


@@ -1,241 +0,0 @@
# -*- coding: utf-8 -*-
#
# This file is execfile()d with the current directory set to its containing dir.
from textwrap import dedent
import sys, os
# If extensions (or modules to document with autodoc) are in another directory,
# add these directories to sys.path here. If the directory is relative to the
# documentation root, use os.path.abspath to make it absolute, like shown here.
ext_paths = [os.path.abspath(os.path.pardir), os.path.dirname(__file__)]
sys.path = ext_paths + sys.path
# -- General configuration -----------------------------------------------------
# If your documentation needs a minimal Sphinx version, state it here.
#needs_sphinx = '1.0'
# Add any Sphinx extension module names here, as strings. They can be extensions
# coming with Sphinx (named 'sphinx.ext.*') or your custom ones.
extensions = [
'sphinx.ext.autodoc',
'sphinx.ext.coverage',
'sphinx.ext.doctest',
'sphinx.ext.intersphinx',
'sphinx.ext.viewcode',
'jsonschema_role',
]
cache_path = "_cache"
# Add any paths that contain templates here, relative to this directory.
templates_path = ['_templates']
# The suffix of source filenames.
source_suffix = '.rst'
# The encoding of source files.
#source_encoding = 'utf-8-sig'
# The master toctree document.
master_doc = 'index'
# General information about the project.
project = u'jsonschema'
copyright = u'2013, Julian Berman'
# The version info for the project you're documenting, acts as replacement for
# |version| and |release|, also used in various other places throughout the
# built documents.
#
# version: The short X.Y version
# release: The full version, including alpha/beta/rc tags.
from jsonschema import __version__ as release
version = release.partition("-")[0]
# There are two options for replacing |today|: either, you set today to some
# non-false value, then it is used:
#today = ''
# Else, today_fmt is used as the format for a strftime call.
#today_fmt = '%B %d, %Y'
# List of patterns, relative to source directory, that match files and
# directories to ignore when looking for source files.
exclude_patterns = ['_build', "_cache", "_static", "_templates"]
# The reST default role (used for this markup: `text`) to use for all documents.
#default_role = None
# The name of the Pygments (syntax highlighting) style to use.
pygments_style = 'sphinx'
# A list of ignored prefixes for module index sorting.
#modindex_common_prefix = []
doctest_global_setup = dedent("""
from __future__ import print_function
from jsonschema import *
""")
intersphinx_mapping = {"python": ("http://docs.python.org/3.2", None)}
# -- Options for HTML output ---------------------------------------------------
# The theme to use for HTML and HTML Help pages. See the documentation for
# a list of builtin themes.
html_theme = 'pyramid'
# Theme options are theme-specific and customize the look and feel of a theme
# further. For a list of options available for each theme, see the
# documentation.
#html_theme_options = {}
# Add any paths that contain custom themes here, relative to this directory.
#html_theme_path = []
# The name for this set of Sphinx documents. If None, it defaults to
# "<project> v<release> documentation".
#html_title = None
# A shorter title for the navigation bar. Default is the same as html_title.
#html_short_title = None
# The name of an image file (relative to this directory) to place at the top
# of the sidebar.
#html_logo = None
# The name of an image file (within the static path) to use as favicon of the
# docs. This file should be a Windows icon file (.ico) being 16x16 or 32x32
# pixels large.
#html_favicon = None
# Add any paths that contain custom static files (such as style sheets) here,
# relative to this directory. They are copied after the builtin static files,
# so a file named "default.css" will overwrite the builtin "default.css".
# html_static_path = ['_static']
# If not '', a 'Last updated on:' timestamp is inserted at every page bottom,
# using the given strftime format.
#html_last_updated_fmt = '%b %d, %Y'
# If true, SmartyPants will be used to convert quotes and dashes to
# typographically correct entities.
#html_use_smartypants = True
# Custom sidebar templates, maps document names to template names.
#html_sidebars = {}
# Additional templates that should be rendered to pages, maps page names to
# template names.
#html_additional_pages = {}
# If false, no module index is generated.
#html_domain_indices = True
# If false, no index is generated.
#html_use_index = True
# If true, the index is split into individual pages for each letter.
#html_split_index = False
# If true, links to the reST sources are added to the pages.
#html_show_sourcelink = True
# If true, "Created using Sphinx" is shown in the HTML footer. Default is True.
#html_show_sphinx = True
# If true, "(C) Copyright ..." is shown in the HTML footer. Default is True.
#html_show_copyright = True
# If true, an OpenSearch description file will be output, and all pages will
# contain a <link> tag referring to it. The value of this option must be the
# base URL from which the finished HTML is served.
#html_use_opensearch = ''
# This is the file name suffix for HTML files (e.g. ".xhtml").
#html_file_suffix = None
# Output file base name for HTML help builder.
htmlhelp_basename = 'jsonschemadoc'
# -- Options for LaTeX output --------------------------------------------------
latex_elements = {
# The paper size ('letterpaper' or 'a4paper').
#'papersize': 'letterpaper',
# The font size ('10pt', '11pt' or '12pt').
#'pointsize': '10pt',
# Additional stuff for the LaTeX preamble.
#'preamble': '',
}
# Grouping the document tree into LaTeX files. List of tuples
# (source start file, target name, title, author, documentclass [howto/manual]).
latex_documents = [
('index', 'jsonschema.tex', u'jsonschema Documentation',
u'Julian Berman', 'manual'),
]
# The name of an image file (relative to this directory) to place at the top of
# the title page.
#latex_logo = None
# For "manual" documents, if this is true, then toplevel headings are parts,
# not chapters.
#latex_use_parts = False
# If true, show page references after internal links.
#latex_show_pagerefs = False
# If true, show URL addresses after external links.
#latex_show_urls = False
# Documents to append as an appendix to all manuals.
#latex_appendices = []
# If false, no module index is generated.
#latex_domain_indices = True
# -- Options for manual page output --------------------------------------------
# One entry per manual page. List of tuples
# (source start file, name, description, authors, manual section).
man_pages = [
('index', 'jsonschema', u'jsonschema Documentation',
[u'Julian Berman'], 1)
]
# If true, show URL addresses after external links.
#man_show_urls = False
# -- Options for Texinfo output ------------------------------------------------
# Grouping the document tree into Texinfo files. List of tuples
# (source start file, target name, title, author,
# dir menu entry, description, category)
texinfo_documents = [
('index', 'jsonschema', u'jsonschema Documentation',
u'Julian Berman', 'jsonschema', 'One line description of project.',
'Miscellaneous'),
]
# Documents to append as an appendix to all manuals.
#texinfo_appendices = []
# If false, no module index is generated.
#texinfo_domain_indices = True
# How to display URL addresses: 'footnote', 'no', or 'inline'.
#texinfo_show_urls = 'footnote'
# -- Read the Docs -------------------------------------------------------------
# Ooo pretty.
RTD_NEW_THEME = True


@@ -1,91 +0,0 @@
.. _creating-validators:
================================
Creating or Extending Validators
================================
.. currentmodule:: jsonschema.validators
.. autofunction:: create
Create a new validator (class).
:argument dict meta_schema: the meta schema for the new validator class
:argument dict validators: a mapping from validator names to functions that
validate the given name. Each function should take 4 arguments: a
validator instance, the value of the current validator property in the
instance being validated, the instance, and the schema.
:argument str version: an identifier for the version that this validator
will validate. If provided, the returned validator class will have its
``__name__`` set to include the version, and also will have
:func:`validates` automatically called for the given version.
:argument dict default_types: a default mapping to use for instances of the
validator when mapping between JSON types to Python types. The default
for this argument is probably fine. Instances of the returned validator
can still have their types customized on a per-instance basis.
:returns: a new :class:`jsonschema.IValidator` class
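As a quick illustration of that signature (a hedged sketch, not from the original
documentation; the ``mini`` version name and the ``required`` function are invented
for the example):
.. code-block:: python
from jsonschema import ValidationError
from jsonschema.validators import create

def required(validator, required_list, instance, schema):
    # the schema value is the list of property names that must be present
    for name in required_list:
        if name not in instance:
            yield ValidationError("%r is a required property" % (name,))

MiniValidator = create(
    meta_schema={},  # accept any schema as valid
    validators={"required": required},
    version="mini",
)

MiniValidator({"required": ["email"]}).validate({"name": "Eggs"})  # raises ValidationError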
.. autofunction:: extend
Create a new validator that extends an existing validator class.
:argument jsonschema.IValidator validator: an existing validator class
:argument dict validators: a set of new validators to add to the new
validator.
.. note::
Any validators with the same name as an existing one will
(silently) replace the old validator entirely.
If you wish to extend an old validator, call it directly in the
replacing validator function by retrieving it using
``OldValidator.VALIDATORS["the validator"]``.
:argument str version: a version for the new validator
:returns: a new :class:`jsonschema.IValidator` class
.. note:: Meta Schemas
The new validator will just keep the old validator's meta schema.
If you wish to change or extend the meta schema in the new validator,
modify ``META_SCHEMA`` directly on the returned class.
The meta schema on the new validator will not be a copy, so you'll
probably want to copy it before modifying it to not affect the old
validator.
.. autofunction:: validator_for
Retrieve the validator appropriate for validating the given schema.
Uses the :validator:`$schema` property that should be present in the given
schema to look up the appropriate validator.
:argument schema: the schema to look at
:argument default: the default to return if the appropriate validator
cannot be determined. If unprovided, the default will be to just return
:class:`Draft4Validator`
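For instance (a small sketch based on the behaviour described above, using the
draft-03 meta-schema URI as the lookup key):
.. code-block:: python
from jsonschema import Draft3Validator, Draft4Validator
from jsonschema.validators import validator_for

# the $schema property picks the validator class...
assert validator_for(
    {"$schema": "http://json-schema.org/draft-03/schema#"},
) is Draft3Validator

# ...and with no $schema the stated default is returned
assert validator_for({}) is Draft4Validator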
.. autofunction:: validates
Creating Validation Errors
--------------------------
Any validating function that validates against a subschema should call
:meth:`ValidatorMixin.descend`, rather than :meth:`ValidatorMixin.iter_errors`.
If it recurses into the instance, or schema, it should pass one or both of the
``path`` or ``schema_path`` arguments to :meth:`ValidatorMixin.descend` in
order to properly maintain where in the instance or schema respectively the
error occurred.
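For example, a validator function for a hypothetical ``eachItem`` keyword
(invented here purely for illustration) that recurses into array elements might
look like this:
.. code-block:: python
def eachItem(validator, subschema, instance, schema):
    if not validator.is_type(instance, "array"):
        return
    for index, item in enumerate(instance):
        # descend() keeps error.path and error.schema_path pointing at the
        # element and keyword that actually failed
        for error in validator.descend(item, subschema, path=index):
            yield error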


@@ -1 +0,0 @@
lxml


@@ -1,434 +0,0 @@
==========================
Handling Validation Errors
==========================
.. currentmodule:: jsonschema.exceptions
When an invalid instance is encountered, a :exc:`ValidationError` will be
raised or returned, depending on which method or function is used.
.. autoexception:: ValidationError
The instance didn't properly validate under the provided schema.
The information carried by an error roughly breaks down into:
=============== ================= ========================
What Happened   Why Did It Happen What Was Being Validated
=============== ================= ========================
:attr:`message` :attr:`context`   :attr:`instance`
                :attr:`cause`     :attr:`path`
                                  :attr:`schema`
                                  :attr:`schema_path`
                                  :attr:`validator`
                                  :attr:`validator_value`
=============== ================= ========================
.. attribute:: message
A human readable message explaining the error.
.. attribute:: validator
The failed `validator
<http://json-schema.org/latest/json-schema-validation.html#anchor12>`_.
.. attribute:: validator_value
The value for the failed validator in the schema.
.. attribute:: schema
The full schema that this error came from. This is potentially a
subschema from within the schema that was passed into the validator, or
even an entirely different schema if a :validator:`$ref` was followed.
.. attribute:: relative_schema_path
A :class:`collections.deque` containing the path to the failed
validator within the schema.
.. attribute:: absolute_schema_path
A :class:`collections.deque` containing the path to the failed
validator within the schema, but always relative to the
*original* schema as opposed to any subschema (i.e. the one
originally passed into a validator, *not* :attr:`schema`\).
.. attribute:: schema_path
Same as :attr:`relative_schema_path`.
.. attribute:: relative_path
A :class:`collections.deque` containing the path to the
offending element within the instance. The deque can be empty if
the error happened at the root of the instance.
.. attribute:: absolute_path
A :class:`collections.deque` containing the path to the
offending element within the instance. The absolute path
is always relative to the *original* instance that was
validated (i.e. the one passed into a validation method, *not*
:attr:`instance`\). The deque can be empty if the error happened
at the root of the instance.
.. attribute:: path
Same as :attr:`relative_path`.
.. attribute:: instance
The instance that was being validated. This will differ from the
instance originally passed into validate if the validator was in the
process of validating a (possibly nested) element within the top-level
instance. The path within the top-level instance (i.e.
:attr:`ValidationError.path`) could be used to find this object, but it
is provided for convenience.
.. attribute:: context
If the error was caused by errors in subschemas, the list of errors
from the subschemas will be available on this property. The
:attr:`.schema_path` and :attr:`.path` of these errors will be relative
to the parent error.
.. attribute:: cause
If the error was caused by a *non*-validation error, the exception
object will be here. Currently this is only used for the exception
raised by a failed format checker in :meth:`FormatChecker.check`.
.. attribute:: parent
A validation error which this error is the :attr:`context` of.
``None`` if there wasn't one.
In case an invalid schema itself is encountered, a :exc:`SchemaError` is
raised.
.. autoexception:: SchemaError
The provided schema is malformed.
The same attributes are present as for :exc:`ValidationError`\s.
These attributes can be clarified with a short example:
.. testcode::
schema = {
"items": {
"anyOf": [
{"type": "string", "maxLength": 2},
{"type": "integer", "minimum": 5}
]
}
}
instance = [{}, 3, "foo"]
v = Draft4Validator(schema)
errors = sorted(v.iter_errors(instance), key=lambda e: e.path)
The error messages in this situation are not very helpful on their own.
.. testcode::
for error in errors:
print(error.message)
outputs:
.. testoutput::
{} is not valid under any of the given schemas
3 is not valid under any of the given schemas
'foo' is not valid under any of the given schemas
If we look at :attr:`~ValidationError.path` on each of the errors, we can find
out which elements in the instance correspond to each of the errors. In
this example, :attr:`~ValidationError.path` will have only one element, which
will be the index in our list.
.. testcode::
for error in errors:
print(list(error.path))
.. testoutput::
[0]
[1]
[2]
Since our schema contained nested subschemas, it can be helpful to look at
the specific part of the instance and subschema that caused each of the errors.
This can be seen with the :attr:`~ValidationError.instance` and
:attr:`~ValidationError.schema` attributes.
With validators like :validator:`anyOf`, the :attr:`~ValidationError.context`
attribute can be used to see the sub-errors which caused the failure. Since
these errors actually came from two separate subschemas, it can be helpful to
look at the :attr:`~ValidationError.schema_path` attribute as well to see where
exactly in the schema each of these errors come from. In the case of sub-errors
from the :attr:`~ValidationError.context` attribute, this path will be relative
to the :attr:`~ValidationError.schema_path` of the parent error.
.. testcode::
for error in errors:
for suberror in sorted(error.context, key=lambda e: e.schema_path):
print(list(suberror.schema_path), suberror.message, sep=", ")
.. testoutput::
[0, 'type'], {} is not of type 'string'
[1, 'type'], {} is not of type 'integer'
[0, 'type'], 3 is not of type 'string'
[1, 'minimum'], 3 is less than the minimum of 5
[0, 'maxLength'], 'foo' is too long
[1, 'type'], 'foo' is not of type 'integer'
The string representation of an error combines some of these attributes for
easier debugging.
.. testcode::
print(errors[1])
.. testoutput::
3 is not valid under any of the given schemas
Failed validating 'anyOf' in schema['items']:
{'anyOf': [{'maxLength': 2, 'type': 'string'},
{'minimum': 5, 'type': 'integer'}]}
On instance[1]:
3
ErrorTrees
----------
If you want to programmatically be able to query which properties or validators
failed when validating a given instance, you probably will want to do so using
:class:`ErrorTree` objects.
.. autoclass:: jsonschema.validators.ErrorTree
:members:
:special-members:
:exclude-members: __dict__,__weakref__
.. attribute:: errors
The mapping of validator names to the error objects (usually
:class:`ValidationError`\s) at this level of the tree.
Consider the following example:
.. testcode::
schema = {
"type" : "array",
"items" : {"type" : "number", "enum" : [1, 2, 3]},
"minItems" : 3,
}
instance = ["spam", 2]
For clarity's sake, the given instance has three errors under this schema:
.. testcode::
v = Draft3Validator(schema)
for error in sorted(v.iter_errors(["spam", 2]), key=str):
print(error.message)
.. testoutput::
'spam' is not of type 'number'
'spam' is not one of [1, 2, 3]
['spam', 2] is too short
Let's construct an :class:`ErrorTree` so that we can query the errors a bit
more easily than by just iterating over the error objects.
.. testcode::
tree = ErrorTree(v.iter_errors(instance))
As you can see, :class:`ErrorTree` takes an iterable of
:class:`ValidationError`\s when constructing a tree so you can directly pass it
the return value of a validator's :attr:`~IValidator.iter_errors` method.
:class:`ErrorTree`\s support a number of useful operations. The first one we
might want to perform is to check whether a given element in our instance
failed validation. We do so using the :keyword:`in` operator:
.. doctest::
>>> 0 in tree
True
>>> 1 in tree
False
The interpretation here is that the 0th index into the instance (``"spam"``)
did have an error (in fact it had 2), while the 1th index (``2``) did not (i.e.
it was valid).
If we want to see which errors a child had, we index into the tree and look at
the :attr:`~ErrorTree.errors` attribute.
.. doctest::
>>> sorted(tree[0].errors)
['enum', 'type']
Here we see that the :validator:`enum` and :validator:`type` validators failed
for index ``0``. In fact :attr:`~ErrorTree.errors` is a dict, whose values are
the :class:`ValidationError`\s, so we can get at those directly if we want
them.
.. doctest::
>>> print(tree[0].errors["type"].message)
'spam' is not of type 'number'
Of course this means that if we want to know if a given validator failed for a
given index, we check for its presence in :attr:`~ErrorTree.errors`:
.. doctest::
>>> "enum" in tree[0].errors
True
>>> "minimum" in tree[0].errors
False
Finally, if you were paying close enough attention, you'll notice that we
haven't seen our :validator:`minItems` error appear anywhere yet. This is
because :validator:`minItems` is an error that applies globally to the instance
itself. So it appears in the root node of the tree.
.. doctest::
>>> "minItems" in tree.errors
True
That's all you need to know to use error trees.
To summarize, each tree contains child trees that can be accessed by indexing
the tree to get the corresponding child tree for a given index into the
instance. Each tree and child has a :attr:`~ErrorTree.errors` attribute, a
dict, that maps the failed validator to the corresponding validation error.
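As a compact recap (a hedged sketch, not from the original page), child trees are
indexed the same way the instance is:
.. code-block:: python
from jsonschema import Draft4Validator, ErrorTree

schema = {"properties": {"spam": {"type": "number"}}}
v = Draft4Validator(schema)
tree = ErrorTree(v.iter_errors({"spam": "not a number"}))

# child trees mirror the structure of the instance, keyed by property name
assert "type" in tree["spam"].errors
print(tree["spam"].errors["type"].message)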
best_match and relevance
------------------------
The :func:`best_match` function is a simple but useful function for attempting
to guess the most relevant error in a given bunch.
.. doctest::
>>> from jsonschema import Draft4Validator
>>> from jsonschema.exceptions import best_match
>>> schema = {
... "type": "array",
... "minItems": 3,
... }
>>> print(best_match(Draft4Validator(schema).iter_errors(11)).message)
11 is not of type 'array'
.. autofunction:: best_match
Try to find an error that appears to be the best match among given errors.
In general, errors that are higher up in the instance (i.e. for which
:attr:`ValidationError.path` is shorter) are considered better matches,
since they indicate "more" is wrong with the instance.
If the resulting match is either :validator:`oneOf` or :validator:`anyOf`,
the *opposite* assumption is made -- i.e. the deepest error is picked,
since these validators only need to match once, and any other errors may
not be relevant.
:argument iterable errors: the errors to select from. Do not provide a
mixture of errors from different validation attempts (i.e. from
different instances or schemas), since it won't produce sensible output.
output.
:argument callable key: the key to use when sorting errors. See
:attr:`relevance` and transitively :func:`by_relevance` for more
details (the default is to sort with the defaults of that function).
Changing the default is only useful if you want to change the function
that rates errors but still want the descent into error contexts done by
this function.
:returns: the best matching error, or ``None`` if the iterable was empty
.. note::
This function is a heuristic. Its return value may change for a given
set of inputs from version to version if better heuristics are added.
.. function:: relevance(validation_error)
A key function that sorts errors based on heuristic relevance.
If you want to sort a bunch of errors entirely, you can use
this function to do so. Using this function as a key to e.g.
:func:`sorted` or :func:`max` will cause more relevant errors to be
considered greater than less relevant ones.
Within the different validators that can fail, this function
considers :validator:`anyOf` and :validator:`oneOf` to be *weak*
validation errors, and will sort them lower than other validators at
the same level in the instance.
If you want to change the set of weak [or strong] validators you can create
a custom version of this function with :func:`by_relevance` and provide a
different set of each.
.. doctest::
>>> schema = {
... "properties": {
... "name": {"type": "string"},
... "phones": {
... "properties": {
... "home": {"type": "string"}
... },
... },
... },
... }
>>> instance = {"name": 123, "phones": {"home": [123]}}
>>> errors = Draft4Validator(schema).iter_errors(instance)
>>> [
... e.path[-1]
... for e in sorted(errors, key=exceptions.relevance)
... ]
['home', 'name']
.. autofunction:: by_relevance
Create a key function that can be used to sort errors by relevance.
:argument set weak: a collection of validators to consider to be "weak". If
there are two errors at the same level of the instance and one is in
the set of weak validators, the other error will take priority. By
default, :validator:`anyOf` and :validator:`oneOf` are considered weak
validators and will be superseded by other same-level validation
errors.
:argument set strong: a collection of validators to consider to be "strong"
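A brief, hedged sketch of plugging a customized key into :func:`best_match` (the
particular choice of weak validators here is only an example):
.. code-block:: python
from jsonschema import Draft4Validator
from jsonschema.exceptions import best_match, by_relevance

schema = {
    "properties": {
        "foo": {"type": "string"},
        "bar": {"anyOf": [{"type": "string"}]},
    },
}
errors = Draft4Validator(schema).iter_errors({"foo": 1, "bar": 2})

# rank errors with a customized notion of which validators are "weak"
key = by_relevance(weak=frozenset(["anyOf", "oneOf", "additionalProperties"]))
print(best_match(errors, key=key).message)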


@@ -1,103 +0,0 @@
==========================
Frequently Asked Questions
==========================
Why doesn't my schema that has a default property actually set the default on my instance?
------------------------------------------------------------------------------------------
The basic answer is that the specification does not require that
:validator:`default` actually do anything.
For an inkling as to *why* it doesn't actually do anything, consider that none
of the other validators modify the instance either. More importantly, having
:validator:`default` modify the instance can produce quite peculiar things.
It's perfectly valid (and perhaps even useful) to have a default that is not
valid under the schema it lives in! So an instance modified by the default
would pass validation the first time, but fail the second!
Still, filling in defaults is a thing that is useful. :mod:`jsonschema` allows
you to :doc:`define your own validators <creating>`, so you can easily create a
:class:`IValidator` that does do default setting. Here's some code to get you
started:
.. code-block:: python
from jsonschema import Draft4Validator, validators

def extend_with_default(validator_class):
    validate_properties = validator_class.VALIDATORS["properties"]

    def set_defaults(validator, properties, instance, schema):
        for error in validate_properties(
            validator, properties, instance, schema,
        ):
            yield error

        # .items() rather than .iteritems() so this also runs on Python 3
        for property, subschema in properties.items():
            if "default" in subschema:
                instance.setdefault(property, subschema["default"])

    return validators.extend(
        validator_class, {"properties" : set_defaults},
    )

DefaultValidatingDraft4Validator = extend_with_default(Draft4Validator)

# Example usage:
obj = {}
schema = {'properties': {'foo': {'default': 'bar'}}}

# Note jsonschema.validate(obj, schema, cls=DefaultValidatingDraft4Validator)
# will not work because the metaschema contains `default` directives.
DefaultValidatingDraft4Validator(schema).validate(obj)
assert obj == {'foo': 'bar'}
See the above-linked document for more info on how this works, but basically,
it just extends the :validator:`properties` validator on a
:class:`Draft4Validator` to then go ahead and update all the defaults.
If you're interested in a more interesting solution to a larger class of these
types of transformations, keep an eye on `Seep
<https://github.com/Julian/Seep>`_, which is an experimental data
transformation and extraction library written on top of :mod:`jsonschema`.
How do jsonschema version numbers work?
---------------------------------------
``jsonschema`` tries to follow the `Semantic Versioning <http://semver.org/>`_
specification.
This means broadly that no backwards-incompatible changes should be made in
minor releases (and certainly not in dot releases).
The full picture requires defining what constitutes a backwards-incompatible
change.
The following are simple examples of things considered public API, and
therefore should *not* be changed without bumping a major version number:
* module names and contents, when not marked private by Python convention
(a single leading underscore)
* function and object signature (parameter order and name)
The following are *not* considered public API and may change without notice:
* the exact wording and contents of error messages; typical
reasons to depend on these seem to involve unit tests. API users are
encouraged to use the extensive introspection provided in
:class:`~jsonschema.exceptions.ValidationError`\s instead to make
meaningful assertions about what failed.
* the order in which validation errors are returned or raised
* anything marked private
With the exception of the last of those, flippant changes are avoided, but
changes can and will be made if there is improvement to be had. Feel free to
open an issue ticket if there is a specific issue or question worth raising.


@@ -1,56 +0,0 @@
==========
jsonschema
==========
.. module:: jsonschema
``jsonschema`` is an implementation of `JSON Schema <http://json-schema.org>`_
for Python (supporting 2.6+ including Python 3).
.. code-block:: python
>>> from jsonschema import validate
>>> # A sample schema, like what we'd get from json.load()
>>> schema = {
... "type" : "object",
... "properties" : {
... "price" : {"type" : "number"},
... "name" : {"type" : "string"},
... },
... }
>>> # If no exception is raised by validate(), the instance is valid.
>>> validate({"name" : "Eggs", "price" : 34.99}, schema)
>>> validate(
... {"name" : "Eggs", "price" : "Invalid"}, schema
... ) # doctest: +IGNORE_EXCEPTION_DETAIL
Traceback (most recent call last):
...
ValidationError: 'Invalid' is not of type 'number'
You can find further information (installation instructions, mailing list)
as well as the source code and issue tracker on our
`GitHub page <https://github.com/Julian/jsonschema/>`__.
Contents:
.. toctree::
:maxdepth: 2
validate
errors
references
creating
faq
Indices and tables
==================
* :ref:`genindex`
* :ref:`search`


@@ -1,123 +0,0 @@
from datetime import datetime
from docutils import nodes
import errno
import os

try:
    import urllib2 as urllib
except ImportError:
    import urllib.request as urllib

from lxml import html

VALIDATION_SPEC = "http://json-schema.org/latest/json-schema-validation.html"


def setup(app):
    """
    Install the plugin.

    :argument sphinx.application.Sphinx app: the Sphinx application context
    """
    app.add_config_value("cache_path", "_cache", "")

    try:
        os.makedirs(app.config.cache_path)
    except OSError as error:
        if error.errno != errno.EEXIST:
            raise

    path = os.path.join(app.config.cache_path, "spec.html")
    spec = fetch_or_load(path)
    app.add_role("validator", docutils_sucks(spec))


def fetch_or_load(spec_path):
    """
    Fetch a new specification or use the cache if it's current.

    :argument cache_path: the path to a cached specification
    """
    headers = {}

    try:
        modified = datetime.utcfromtimestamp(os.path.getmtime(spec_path))
        date = modified.strftime("%a, %d %b %Y %I:%M:%S UTC")
        headers["If-Modified-Since"] = date
    except OSError as error:
        if error.errno != errno.ENOENT:
            raise

    request = urllib.Request(VALIDATION_SPEC, headers=headers)
    response = urllib.urlopen(request)

    if response.code == 200:
        with open(spec_path, "w+b") as spec:
            spec.writelines(response)
            spec.seek(0)
            return html.parse(spec)

    with open(spec_path) as spec:
        return html.parse(spec)


def docutils_sucks(spec):
    """
    Yeah.

    It doesn't allow using a class because it does stupid stuff like try to set
    attributes on the callable object rather than just keeping a dict.
    """
    base_url = VALIDATION_SPEC
    ref_url = "http://json-schema.org/latest/json-schema-core.html#anchor25"
    schema_url = "http://json-schema.org/latest/json-schema-core.html#anchor22"

    def validator(name, raw_text, text, lineno, inliner):
        """
        Link to the JSON Schema documentation for a validator.

        :argument str name: the name of the role in the document
        :argument str raw_source: the raw text (role with argument)
        :argument str text: the argument given to the role
        :argument int lineno: the line number
        :argument docutils.parsers.rst.states.Inliner inliner: the inliner

        :returns: 2-tuple of nodes to insert into the document and an iterable
            of system messages, both possibly empty
        """
        if text == "$ref":
            return [nodes.reference(raw_text, text, refuri=ref_url)], []
        elif text == "$schema":
            return [nodes.reference(raw_text, text, refuri=schema_url)], []

        xpath = "//h3[re:match(text(), '(^|\W)\"?{0}\"?($|\W,)', 'i')]"
        header = spec.xpath(
            xpath.format(text),
            namespaces={"re": "http://exslt.org/regular-expressions"},
        )

        if len(header) == 0:
            inliner.reporter.warning(
                "Didn't find a target for {0}".format(text),
            )
            uri = base_url
        else:
            if len(header) > 1:
                inliner.reporter.info(
                    "Found multiple targets for {0}".format(text),
                )
            uri = base_url + "#" + header[0].getprevious().attrib["name"]

        reference = nodes.reference(raw_text, text, refuri=uri)
        return [reference], []

    return validator


@@ -1,190 +0,0 @@
@ECHO OFF
REM Command file for Sphinx documentation
if "%SPHINXBUILD%" == "" (
set SPHINXBUILD=sphinx-build
)
set BUILDDIR=_build
set ALLSPHINXOPTS=-d %BUILDDIR%/doctrees %SPHINXOPTS% .
set I18NSPHINXOPTS=%SPHINXOPTS% .
if NOT "%PAPER%" == "" (
set ALLSPHINXOPTS=-D latex_paper_size=%PAPER% %ALLSPHINXOPTS%
set I18NSPHINXOPTS=-D latex_paper_size=%PAPER% %I18NSPHINXOPTS%
)
if "%1" == "" goto help
if "%1" == "help" (
:help
echo.Please use `make ^<target^>` where ^<target^> is one of
echo. html to make standalone HTML files
echo. dirhtml to make HTML files named index.html in directories
echo. singlehtml to make a single large HTML file
echo. pickle to make pickle files
echo. json to make JSON files
echo. htmlhelp to make HTML files and a HTML help project
echo. qthelp to make HTML files and a qthelp project
echo. devhelp to make HTML files and a Devhelp project
echo. epub to make an epub
echo. latex to make LaTeX files, you can set PAPER=a4 or PAPER=letter
echo. text to make text files
echo. man to make manual pages
echo. texinfo to make Texinfo files
echo. gettext to make PO message catalogs
echo. changes to make an overview over all changed/added/deprecated items
echo. linkcheck to check all external links for integrity
echo. doctest to run all doctests embedded in the documentation if enabled
goto end
)
if "%1" == "clean" (
for /d %%i in (%BUILDDIR%\*) do rmdir /q /s %%i
del /q /s %BUILDDIR%\*
goto end
)
if "%1" == "html" (
%SPHINXBUILD% -b html %ALLSPHINXOPTS% %BUILDDIR%/html
if errorlevel 1 exit /b 1
echo.
echo.Build finished. The HTML pages are in %BUILDDIR%/html.
goto end
)
if "%1" == "dirhtml" (
%SPHINXBUILD% -b dirhtml %ALLSPHINXOPTS% %BUILDDIR%/dirhtml
if errorlevel 1 exit /b 1
echo.
echo.Build finished. The HTML pages are in %BUILDDIR%/dirhtml.
goto end
)
if "%1" == "singlehtml" (
%SPHINXBUILD% -b singlehtml %ALLSPHINXOPTS% %BUILDDIR%/singlehtml
if errorlevel 1 exit /b 1
echo.
echo.Build finished. The HTML pages are in %BUILDDIR%/singlehtml.
goto end
)
if "%1" == "pickle" (
%SPHINXBUILD% -b pickle %ALLSPHINXOPTS% %BUILDDIR%/pickle
if errorlevel 1 exit /b 1
echo.
echo.Build finished; now you can process the pickle files.
goto end
)
if "%1" == "json" (
%SPHINXBUILD% -b json %ALLSPHINXOPTS% %BUILDDIR%/json
if errorlevel 1 exit /b 1
echo.
echo.Build finished; now you can process the JSON files.
goto end
)
if "%1" == "htmlhelp" (
%SPHINXBUILD% -b htmlhelp %ALLSPHINXOPTS% %BUILDDIR%/htmlhelp
if errorlevel 1 exit /b 1
echo.
echo.Build finished; now you can run HTML Help Workshop with the ^
.hhp project file in %BUILDDIR%/htmlhelp.
goto end
)
if "%1" == "qthelp" (
%SPHINXBUILD% -b qthelp %ALLSPHINXOPTS% %BUILDDIR%/qthelp
if errorlevel 1 exit /b 1
echo.
echo.Build finished; now you can run "qcollectiongenerator" with the ^
.qhcp project file in %BUILDDIR%/qthelp, like this:
echo.^> qcollectiongenerator %BUILDDIR%\qthelp\jsonschema.qhcp
echo.To view the help file:
echo.^> assistant -collectionFile %BUILDDIR%\qthelp\jsonschema.ghc
goto end
)
if "%1" == "devhelp" (
%SPHINXBUILD% -b devhelp %ALLSPHINXOPTS% %BUILDDIR%/devhelp
if errorlevel 1 exit /b 1
echo.
echo.Build finished.
goto end
)
if "%1" == "epub" (
%SPHINXBUILD% -b epub %ALLSPHINXOPTS% %BUILDDIR%/epub
if errorlevel 1 exit /b 1
echo.
echo.Build finished. The epub file is in %BUILDDIR%/epub.
goto end
)
if "%1" == "latex" (
%SPHINXBUILD% -b latex %ALLSPHINXOPTS% %BUILDDIR%/latex
if errorlevel 1 exit /b 1
echo.
echo.Build finished; the LaTeX files are in %BUILDDIR%/latex.
goto end
)
if "%1" == "text" (
%SPHINXBUILD% -b text %ALLSPHINXOPTS% %BUILDDIR%/text
if errorlevel 1 exit /b 1
echo.
echo.Build finished. The text files are in %BUILDDIR%/text.
goto end
)
if "%1" == "man" (
%SPHINXBUILD% -b man %ALLSPHINXOPTS% %BUILDDIR%/man
if errorlevel 1 exit /b 1
echo.
echo.Build finished. The manual pages are in %BUILDDIR%/man.
goto end
)
if "%1" == "texinfo" (
%SPHINXBUILD% -b texinfo %ALLSPHINXOPTS% %BUILDDIR%/texinfo
if errorlevel 1 exit /b 1
echo.
echo.Build finished. The Texinfo files are in %BUILDDIR%/texinfo.
goto end
)
if "%1" == "gettext" (
%SPHINXBUILD% -b gettext %I18NSPHINXOPTS% %BUILDDIR%/locale
if errorlevel 1 exit /b 1
echo.
echo.Build finished. The message catalogs are in %BUILDDIR%/locale.
goto end
)
if "%1" == "changes" (
%SPHINXBUILD% -b changes %ALLSPHINXOPTS% %BUILDDIR%/changes
if errorlevel 1 exit /b 1
echo.
echo.The overview file is in %BUILDDIR%/changes.
goto end
)
if "%1" == "linkcheck" (
%SPHINXBUILD% -b linkcheck %ALLSPHINXOPTS% %BUILDDIR%/linkcheck
if errorlevel 1 exit /b 1
echo.
echo.Link check complete; look for any errors in the above output ^
or in %BUILDDIR%/linkcheck/output.txt.
goto end
)
if "%1" == "doctest" (
%SPHINXBUILD% -b doctest %ALLSPHINXOPTS% %BUILDDIR%/doctest
if errorlevel 1 exit /b 1
echo.
echo.Testing of doctests in the sources finished, look at the ^
results in %BUILDDIR%/doctest/output.txt.
goto end
)
:end


@@ -1,13 +0,0 @@
=========================
Resolving JSON References
=========================
.. currentmodule:: jsonschema
.. autoclass:: RefResolver
:members:
.. autoexception:: RefResolutionError
A JSON reference failed to resolve.
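As a short illustration (a sketch, not part of the original page), a resolver
built from a schema lets :validator:`$ref` point at local definitions; validators
create one automatically, but it can also be passed in explicitly:
.. code-block:: python
from jsonschema import Draft4Validator, RefResolver

schema = {
    "definitions": {"name": {"type": "string"}},
    "properties": {"first": {"$ref": "#/definitions/name"}},
}
resolver = RefResolver.from_schema(schema)
Draft4Validator(schema, resolver=resolver).validate({"first": "Julian"})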


@@ -1,288 +0,0 @@
=================
Schema Validation
=================
.. currentmodule:: jsonschema
The Basics
----------
The simplest way to validate an instance under a given schema is to use the
:func:`validate` function.
.. autofunction:: validate
.. [#] For information on creating JSON schemas to validate
your data, there is a good introduction to JSON Schema
fundamentals underway at `Understanding JSON Schema
<http://spacetelescope.github.io/understanding-json-schema/>`_
The Validator Interface
-----------------------
:mod:`jsonschema` defines an (informal) interface that all validators should
adhere to.
.. class:: IValidator(schema, types=(), resolver=None, format_checker=None)
:argument dict schema: the schema that the validator will validate with. It
is assumed to be valid, and providing an invalid
schema can lead to undefined behavior. See
:meth:`IValidator.check_schema` to validate a schema
first.
:argument types: Override or extend the list of known types when validating
the :validator:`type` property. Should map strings (type
names) to class objects that will be checked via
:func:`isinstance`. See :ref:`validating-types` for
details.
:type types: dict or iterable of 2-tuples
:argument resolver: an instance of :class:`RefResolver` that will be used
to resolve :validator:`$ref` properties (JSON
references). If unprovided, one will be created.
:argument format_checker: an instance of :class:`FormatChecker` whose
:meth:`~conforms` method will be called to check
and see if instances conform to each
:validator:`format` property present in the
schema. If unprovided, no validation will be done
for :validator:`format`.
.. attribute:: DEFAULT_TYPES
The default mapping of JSON types to Python types used when validating
:validator:`type` properties in JSON schemas.
.. attribute:: META_SCHEMA
An object representing the validator's meta schema (the schema that
describes valid schemas in the given version).
.. attribute:: VALIDATORS
A mapping of validators (:class:`str`\s) to functions that validate the
validator property with that name. For more information see
:ref:`creating-validators`.
.. attribute:: schema
The schema that was passed in when initializing the validator.
.. classmethod:: check_schema(schema)
Validate the given schema against the validator's :attr:`META_SCHEMA`.
:raises: :exc:`SchemaError` if the schema is invalid
.. method:: is_type(instance, type)
Check if the instance is of the given (JSON Schema) type.
:type type: str
:rtype: bool
:raises: :exc:`UnknownType` if ``type`` is not a known type.
.. method:: is_valid(instance)
Check if the instance is valid under the current :attr:`schema`.
:rtype: bool
>>> schema = {"maxItems" : 2}
>>> Draft3Validator(schema).is_valid([2, 3, 4])
False
.. method:: iter_errors(instance)
Lazily yield each of the validation errors in the given instance.
:rtype: an iterable of :exc:`ValidationError`\s
>>> schema = {
... "type" : "array",
... "items" : {"enum" : [1, 2, 3]},
... "maxItems" : 2,
... }
>>> v = Draft3Validator(schema)
>>> for error in sorted(v.iter_errors([2, 3, 4]), key=str):
... print(error.message)
4 is not one of [1, 2, 3]
[2, 3, 4] is too long
.. method:: validate(instance)
Check if the instance is valid under the current :attr:`schema`.
:raises: :exc:`ValidationError` if the instance is invalid
>>> schema = {"maxItems" : 2}
>>> Draft3Validator(schema).validate([2, 3, 4])
Traceback (most recent call last):
...
ValidationError: [2, 3, 4] is too long
All of the :ref:`versioned validators <versioned-validators>` that are included
with :mod:`jsonschema` adhere to the interface, and implementors of validators
that extend or complement the ones included should adhere to it as well. For
more information see :ref:`creating-validators`.
.. _validating-types:
Validating With Additional Types
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
Occasionally it can be useful to provide additional or alternate types when
validating the JSON Schema's :validator:`type` property. Validators allow this
by taking a ``types`` argument on construction that specifies additional types,
or which can be used to specify a different set of Python types to map to a
given JSON type.
:mod:`jsonschema` tries to strike a balance between performance in the common
case and generality. For instance, JSON Schema defines a ``number`` type, which
can be validated with a schema such as ``{"type" : "number"}``. By default,
this will accept instances of Python :class:`numbers.Number`. This includes in
particular :class:`int`\s and :class:`float`\s, along with
:class:`decimal.Decimal` objects, :class:`complex` numbers etc. For
``integer`` and ``object``, however, rather than checking for
:class:`numbers.Integral` and :class:`collections.abc.Mapping`,
:mod:`jsonschema` simply checks for :class:`int` and :class:`dict`, since the
more general instance checks can introduce significant slowdown, especially
given how common validating these types are.
If you *do* want the generality, or just want to add a few specific additional
types as being acceptable for a validator, :class:`IValidator`\s have a
``types`` argument that can be used to provide additional or new types.
.. code-block:: python
class MyInteger(object):
...
Draft3Validator(
schema={"type" : "number"},
types={"number" : (numbers.Number, MyInteger)},
)
The list of default Python types for each JSON type is available on each
validator in the :attr:`IValidator.DEFAULT_TYPES` attribute. Note that you
need to specify all types to match if you override one of the existing JSON
types, so you may want to access the set of default types when specifying your
additional type.
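One way to do that (a sketch which hedges on whether the existing entry is a
single class or a tuple; ``MyInteger`` is again hypothetical):
.. code-block:: python
from jsonschema import Draft4Validator

class MyInteger(object):
    pass

existing = Draft4Validator.DEFAULT_TYPES["integer"]
existing = existing if isinstance(existing, tuple) else (existing,)

validator = Draft4Validator(
    {"type" : "integer"},
    # keep the default integer types and add MyInteger alongside them
    types={"integer" : existing + (MyInteger,)},
)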
.. _versioned-validators:
Versioned Validators
--------------------
:mod:`jsonschema` ships with validators for various versions of the JSON Schema
specification. For details on the methods and attributes that each validator
provides see the :class:`IValidator` interface, which each validator
implements.
.. autoclass:: Draft3Validator
.. autoclass:: Draft4Validator
For example, if you wanted to validate a schema you created against the
Draft 4 meta-schema, you could use:
.. code-block:: python
from jsonschema import Draft4Validator
schema = {
    "$schema": "http://json-schema.org/schema#",
    "type": "object",
    "properties": {
        "name": {"type": "string"},
        "email": {"type": "string"},
    },
    "required": ["email"],
}
Draft4Validator.check_schema(schema)
Validating Formats
------------------
JSON Schema defines the :validator:`format` property which can be used to check
if primitive types (``string``\s, ``number``\s, ``boolean``\s) conform to
well-defined formats. By default, no format validation is enforced, but it can
optionally be enabled by hooking a format-checking object into an
:class:`IValidator`.
.. doctest::
>>> validate("localhost", {"format" : "hostname"})
>>> validate(
... "-12", {"format" : "hostname"}, format_checker=FormatChecker(),
... )
Traceback (most recent call last):
...
ValidationError: "-12" is not a "hostname"
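The pre-built checkers exported by :mod:`jsonschema` can be hooked in the same
way; a short sketch using the bundled Draft 4 checker:

.. code-block:: python

    from jsonschema import Draft4Validator, draft4_format_checker

    validator = Draft4Validator(
        {"format" : "ipv4"}, format_checker=draft4_format_checker,
    )
    validator.validate("192.168.0.1")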
.. autoclass:: FormatChecker
:members:
:exclude-members: cls_checks
.. attribute:: checkers
A mapping of currently known formats to tuples of the functions that
validate them and the errors that should be caught. New checkers can be
added either per-instance or globally for all instances, using the
:meth:`FormatChecker.checks` or :meth:`FormatChecker.cls_checks`
decorators respectively.
.. classmethod:: cls_checks(format, raises=())
Register a decorated function as *globally* validating a new format.
Any instance created after this function is called will pick up the
supplied checker.
:argument str format: the format that the decorated function will check
:argument Exception raises: the exception(s) raised by the decorated
function when an invalid instance is found. The exception object
will be accessible as the :attr:`ValidationError.cause` attribute
of the resulting validation error.
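As an illustration, here is a sketch of registering a hypothetical ``even``
format on a single :class:`FormatChecker` instance (the format name and
checker function are invented for this example):

.. code-block:: python

    from jsonschema import FormatChecker, validate

    checker = FormatChecker()

    @checker.checks("even")
    def is_even(instance):
        # As with the built-in checkers, ignore types the format does not
        # apply to rather than failing them.
        if not isinstance(instance, int):
            return True
        return instance % 2 == 0

    validate(4, {"format" : "even"}, format_checker=checker)
    # validate(5, ...) with the same checker would raise a ValidationError.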
There are a number of default checkers that :class:`FormatChecker`\s know how
to validate. Their names can be viewed by inspecting the
:attr:`FormatChecker.checkers` attribute. Certain checkers will only be
available if an appropriate package is installed. The available checkers,
along with their requirements (if any), are listed below.
========== ====================
Checker Notes
========== ====================
hostname
ipv4
ipv6 OS must have :func:`socket.inet_pton` function
email
uri requires rfc3987_
date-time requires strict-rfc3339_ [#]_
date
time
regex
color requires webcolors_
========== ====================
.. [#] For backwards compatibility, isodate_ is also supported, but it will
allow any `ISO 8601 <http://en.wikipedia.org/wiki/ISO_8601>`_ date-time,
not just `RFC 3339 <http://www.ietf.org/rfc/rfc3339.txt>`_ as mandated by
the JSON Schema specification.
.. _isodate: http://pypi.python.org/pypi/isodate/
.. _rfc3987: http://pypi.python.org/pypi/rfc3987/
.. _strict-rfc3339: http://pypi.python.org/pypi/strict-rfc3339/
.. _webcolors: http://pypi.python.org/pypi/webcolors/

json/.gitignore
@ -1 +0,0 @@
TODO

@ -1,4 +0,0 @@
language: python
python: "2.7"
install: pip install jsonschema
script: bin/jsonschema_suite check

@ -1,26 +0,0 @@
"""
An implementation of JSON Schema for Python
The main functionality is provided by the validator classes for each of the
supported JSON Schema versions.
Most commonly, :func:`validate` is the quickest way to simply validate a given
instance under a schema, and will create a validator for you.
"""
from jsonschema.exceptions import (
ErrorTree, FormatError, RefResolutionError, SchemaError, ValidationError
)
from jsonschema._format import (
FormatChecker, draft3_format_checker, draft4_format_checker,
)
from jsonschema.validators import (
Draft3Validator, Draft4Validator, RefResolver, validate
)
__version__ = "2.3.0"
# flake8: noqa

@ -1,240 +0,0 @@
import datetime
import re
import socket
from jsonschema.compat import str_types
from jsonschema.exceptions import FormatError
class FormatChecker(object):
"""
A ``format`` property checker.
JSON Schema does not mandate that the ``format`` property actually do any
validation. If validation is desired however, instances of this class can
be hooked into validators to enable format validation.
:class:`FormatChecker` objects always return ``True`` when asked about
formats that they do not know how to validate.
To check a custom format using a function that takes an instance and
returns a ``bool``, use the :meth:`FormatChecker.checks` or
:meth:`FormatChecker.cls_checks` decorators.
:argument iterable formats: the known formats to validate. This argument
can be used to limit which formats will be used
during validation.
"""
checkers = {}
def __init__(self, formats=None):
if formats is None:
self.checkers = self.checkers.copy()
else:
self.checkers = dict((k, self.checkers[k]) for k in formats)
def checks(self, format, raises=()):
"""
Register a decorated function as validating a new format.
:argument str format: the format that the decorated function will check
:argument Exception raises: the exception(s) raised by the decorated
function when an invalid instance is found. The exception object
will be accessible as the :attr:`ValidationError.cause` attribute
of the resulting validation error.
"""
def _checks(func):
self.checkers[format] = (func, raises)
return func
return _checks
cls_checks = classmethod(checks)
def check(self, instance, format):
"""
Check whether the instance conforms to the given format.
:argument instance: the instance to check
:type: any primitive type (str, number, bool)
:argument str format: the format that instance should conform to
:raises: :exc:`FormatError` if instance does not conform to format
"""
if format not in self.checkers:
return
func, raises = self.checkers[format]
result, cause = None, None
try:
result = func(instance)
except raises as e:
cause = e
if not result:
raise FormatError(
"%r is not a %r" % (instance, format), cause=cause,
)
def conforms(self, instance, format):
"""
Check whether the instance conforms to the given format.
:argument instance: the instance to check
:type: any primitive type (str, number, bool)
:argument str format: the format that instance should conform to
:rtype: bool
"""
try:
self.check(instance, format)
except FormatError:
return False
else:
return True
_draft_checkers = {"draft3": [], "draft4": []}
def _checks_drafts(both=None, draft3=None, draft4=None, raises=()):
draft3 = draft3 or both
draft4 = draft4 or both
def wrap(func):
if draft3:
_draft_checkers["draft3"].append(draft3)
func = FormatChecker.cls_checks(draft3, raises)(func)
if draft4:
_draft_checkers["draft4"].append(draft4)
func = FormatChecker.cls_checks(draft4, raises)(func)
return func
return wrap
@_checks_drafts("email")
def is_email(instance):
if not isinstance(instance, str_types):
return True
return "@" in instance
_ipv4_re = re.compile(r"^\d{1,3}\.\d{1,3}\.\d{1,3}\.\d{1,3}$")
@_checks_drafts(draft3="ip-address", draft4="ipv4")
def is_ipv4(instance):
if not isinstance(instance, str_types):
return True
if not _ipv4_re.match(instance):
return False
return all(0 <= int(component) <= 255 for component in instance.split("."))
if hasattr(socket, "inet_pton"):
@_checks_drafts("ipv6", raises=socket.error)
def is_ipv6(instance):
if not isinstance(instance, str_types):
return True
return socket.inet_pton(socket.AF_INET6, instance)
_host_name_re = re.compile(r"^[A-Za-z0-9][A-Za-z0-9\.\-]{1,255}$")
@_checks_drafts(draft3="host-name", draft4="hostname")
def is_host_name(instance):
if not isinstance(instance, str_types):
return True
if not _host_name_re.match(instance):
return False
components = instance.split(".")
for component in components:
if len(component) > 63:
return False
return True
try:
import rfc3987
except ImportError:
pass
else:
@_checks_drafts("uri", raises=ValueError)
def is_uri(instance):
if not isinstance(instance, str_types):
return True
return rfc3987.parse(instance, rule="URI")
try:
import strict_rfc3339
except ImportError:
try:
import isodate
except ImportError:
pass
else:
@_checks_drafts("date-time", raises=(ValueError, isodate.ISO8601Error))
def is_date(instance):
if not isinstance(instance, str_types):
return True
return isodate.parse_datetime(instance)
else:
@_checks_drafts("date-time")
def is_date(instance):
if not isinstance(instance, str_types):
return True
return strict_rfc3339.validate_rfc3339(instance)
@_checks_drafts("regex", raises=re.error)
def is_regex(instance):
if not isinstance(instance, str_types):
return True
return re.compile(instance)
@_checks_drafts(draft3="date", raises=ValueError)
def is_date(instance):
if not isinstance(instance, str_types):
return True
return datetime.datetime.strptime(instance, "%Y-%m-%d")
@_checks_drafts(draft3="time", raises=ValueError)
def is_time(instance):
if not isinstance(instance, str_types):
return True
return datetime.datetime.strptime(instance, "%H:%M:%S")
try:
import webcolors
except ImportError:
pass
else:
def is_css_color_code(instance):
return webcolors.normalize_hex(instance)
@_checks_drafts(draft3="color", raises=(ValueError, TypeError))
def is_css21_color(instance):
if (
not isinstance(instance, str_types) or
instance.lower() in webcolors.css21_names_to_hex
):
return True
return is_css_color_code(instance)
def is_css3_color(instance):
if instance.lower() in webcolors.css3_names_to_hex:
return True
return is_css_color_code(instance)
draft3_format_checker = FormatChecker(_draft_checkers["draft3"])
draft4_format_checker = FormatChecker(_draft_checkers["draft4"])

@ -1,216 +0,0 @@
import itertools
import json
import re
import os
from jsonschema.compat import str_types, MutableMapping, urlsplit
class URIDict(MutableMapping):
"""
Dictionary which uses normalized URIs as keys.
"""
def normalize(self, uri):
return urlsplit(uri).geturl()
def __init__(self, *args, **kwargs):
self.store = dict()
self.store.update(*args, **kwargs)
def __getitem__(self, uri):
return self.store[self.normalize(uri)]
def __setitem__(self, uri, value):
self.store[self.normalize(uri)] = value
def __delitem__(self, uri):
del self.store[self.normalize(uri)]
def __iter__(self):
return iter(self.store)
def __len__(self):
return len(self.store)
def __repr__(self):
return repr(self.store)
class Unset(object):
"""
An as-of-yet unset attribute or unprovided default parameter.
"""
def __repr__(self):
return "<unset>"
def load_schema(name):
"""
Load a schema from ./schemas/``name``.json and return it.
"""
schema_dir = os.path.join(
os.path.dirname(os.path.abspath(__file__)), "schemas",
)
with open(os.path.join(schema_dir, name + ".json")) as schema_file:
return json.load(schema_file)
def indent(string, times=1):
"""
A dumb version of :func:`textwrap.indent` from Python 3.3.
"""
return "\n".join(" " * (4 * times) + line for line in string.splitlines())
def format_as_index(indices):
"""
Construct a single string containing indexing operations for the indices.
For example, [1, 2, "foo"] -> [1][2]["foo"]
:type indices: sequence
"""
if not indices:
return ""
return "[%s]" % "][".join(repr(index) for index in indices)
def find_additional_properties(instance, schema):
"""
Return the set of additional properties for the given ``instance``.
Weeds out properties that should have been validated by ``properties`` and
/ or ``patternProperties``.
Assumes ``instance`` is dict-like already.
"""
properties = schema.get("properties", {})
patterns = "|".join(schema.get("patternProperties", {}))
for property in instance:
if property not in properties:
if patterns and re.search(patterns, property):
continue
yield property
def extras_msg(extras):
"""
Create an error message for extra items or properties.
"""
if len(extras) == 1:
verb = "was"
else:
verb = "were"
return ", ".join(repr(extra) for extra in extras), verb
def types_msg(instance, types):
"""
Create an error message for a failure to match the given types.
If the ``instance`` is an object and contains a ``name`` property, it will
be considered to be a description of that object and used as its type.
Otherwise the message is simply the reprs of the given ``types``.
"""
reprs = []
for type in types:
try:
reprs.append(repr(type["name"]))
except Exception:
reprs.append(repr(type))
return "%r is not of type %s" % (instance, ", ".join(reprs))
def flatten(suitable_for_isinstance):
"""
isinstance() can accept a bunch of really annoying different types:
* a single type
* a tuple of types
* an arbitrary nested tree of tuples
Return a flattened tuple of the given argument.
"""
types = set()
if not isinstance(suitable_for_isinstance, tuple):
suitable_for_isinstance = (suitable_for_isinstance,)
for thing in suitable_for_isinstance:
if isinstance(thing, tuple):
types.update(flatten(thing))
else:
types.add(thing)
return tuple(types)
def ensure_list(thing):
"""
Wrap ``thing`` in a list if it's a single str.
Otherwise, return it unchanged.
"""
if isinstance(thing, str_types):
return [thing]
return thing
def unbool(element, true=object(), false=object()):
"""
A hack to make True and 1 and False and 0 unique for ``uniq``.
"""
if element is True:
return true
elif element is False:
return false
return element
def uniq(container):
"""
Check if all of a container's elements are unique.
Successively tries first to rely that the elements are hashable, then
falls back on them being sortable, and finally falls back on brute
force.
"""
try:
return len(set(unbool(i) for i in container)) == len(container)
except TypeError:
try:
sort = sorted(unbool(i) for i in container)
sliced = itertools.islice(sort, 1, None)
for i, j in zip(sort, sliced):
if i == j:
return False
except (NotImplementedError, TypeError):
seen = []
for e in container:
e = unbool(e)
if e in seen:
return False
seen.append(e)
return True

@ -1,358 +0,0 @@
import re
from jsonschema import _utils
from jsonschema.exceptions import FormatError, ValidationError
from jsonschema.compat import iteritems
FLOAT_TOLERANCE = 10 ** -15
def patternProperties(validator, patternProperties, instance, schema):
if not validator.is_type(instance, "object"):
return
for pattern, subschema in iteritems(patternProperties):
for k, v in iteritems(instance):
if re.search(pattern, k):
for error in validator.descend(
v, subschema, path=k, schema_path=pattern,
):
yield error
def additionalProperties(validator, aP, instance, schema):
if not validator.is_type(instance, "object"):
return
extras = set(_utils.find_additional_properties(instance, schema))
if validator.is_type(aP, "object"):
for extra in extras:
for error in validator.descend(instance[extra], aP, path=extra):
yield error
elif not aP and extras:
error = "Additional properties are not allowed (%s %s unexpected)"
yield ValidationError(error % _utils.extras_msg(extras))
def items(validator, items, instance, schema):
if not validator.is_type(instance, "array"):
return
if validator.is_type(items, "object"):
for index, item in enumerate(instance):
for error in validator.descend(item, items, path=index):
yield error
else:
for (index, item), subschema in zip(enumerate(instance), items):
for error in validator.descend(
item, subschema, path=index, schema_path=index,
):
yield error
def additionalItems(validator, aI, instance, schema):
if (
not validator.is_type(instance, "array") or
validator.is_type(schema.get("items", {}), "object")
):
return
len_items = len(schema.get("items", []))
if validator.is_type(aI, "object"):
for index, item in enumerate(instance[len_items:], start=len_items):
for error in validator.descend(item, aI, path=index):
yield error
elif not aI and len(instance) > len(schema.get("items", [])):
error = "Additional items are not allowed (%s %s unexpected)"
yield ValidationError(
error %
_utils.extras_msg(instance[len(schema.get("items", [])):])
)
def minimum(validator, minimum, instance, schema):
if not validator.is_type(instance, "number"):
return
if schema.get("exclusiveMinimum", False):
failed = float(instance) <= minimum
cmp = "less than or equal to"
else:
failed = float(instance) < minimum
cmp = "less than"
if failed:
yield ValidationError(
"%r is %s the minimum of %r" % (instance, cmp, minimum)
)
def maximum(validator, maximum, instance, schema):
if not validator.is_type(instance, "number"):
return
if schema.get("exclusiveMaximum", False):
failed = float(instance) >= maximum
cmp = "greater than or equal to"
else:
failed = float(instance) > maximum
cmp = "greater than"
if failed:
yield ValidationError(
"%r is %s the maximum of %r" % (instance, cmp, maximum)
)
def multipleOf(validator, dB, instance, schema):
if not validator.is_type(instance, "number"):
return
if isinstance(dB, float):
mod = instance % dB
failed = (mod > FLOAT_TOLERANCE) and (dB - mod) > FLOAT_TOLERANCE
else:
failed = instance % dB
if failed:
yield ValidationError("%r is not a multiple of %r" % (instance, dB))
def minItems(validator, mI, instance, schema):
if validator.is_type(instance, "array") and len(instance) < mI:
yield ValidationError("%r is too short" % (instance,))
def maxItems(validator, mI, instance, schema):
if validator.is_type(instance, "array") and len(instance) > mI:
yield ValidationError("%r is too long" % (instance,))
def uniqueItems(validator, uI, instance, schema):
if (
uI and
validator.is_type(instance, "array") and
not _utils.uniq(instance)
):
yield ValidationError("%r has non-unique elements" % instance)
def pattern(validator, patrn, instance, schema):
if (
validator.is_type(instance, "string") and
not re.search(patrn, instance)
):
yield ValidationError("%r does not match %r" % (instance, patrn))
def format(validator, format, instance, schema):
if validator.format_checker is not None:
try:
validator.format_checker.check(instance, format)
except FormatError as error:
yield ValidationError(error.message, cause=error.cause)
def minLength(validator, mL, instance, schema):
if validator.is_type(instance, "string") and len(instance) < mL:
yield ValidationError("%r is too short" % (instance,))
def maxLength(validator, mL, instance, schema):
if validator.is_type(instance, "string") and len(instance) > mL:
yield ValidationError("%r is too long" % (instance,))
def dependencies(validator, dependencies, instance, schema):
if not validator.is_type(instance, "object"):
return
for property, dependency in iteritems(dependencies):
if property not in instance:
continue
if validator.is_type(dependency, "object"):
for error in validator.descend(
instance, dependency, schema_path=property,
):
yield error
else:
dependencies = _utils.ensure_list(dependency)
for dependency in dependencies:
if dependency not in instance:
yield ValidationError(
"%r is a dependency of %r" % (dependency, property)
)
def enum(validator, enums, instance, schema):
if instance not in enums:
yield ValidationError("%r is not one of %r" % (instance, enums))
def ref(validator, ref, instance, schema):
with validator.resolver.resolving(ref) as resolved:
for error in validator.descend(instance, resolved):
yield error
def type_draft3(validator, types, instance, schema):
types = _utils.ensure_list(types)
all_errors = []
for index, type in enumerate(types):
if type == "any":
return
if validator.is_type(type, "object"):
errors = list(validator.descend(instance, type, schema_path=index))
if not errors:
return
all_errors.extend(errors)
else:
if validator.is_type(instance, type):
return
else:
yield ValidationError(
_utils.types_msg(instance, types), context=all_errors,
)
def properties_draft3(validator, properties, instance, schema):
if not validator.is_type(instance, "object"):
return
for property, subschema in iteritems(properties):
if property in instance:
for error in validator.descend(
instance[property],
subschema,
path=property,
schema_path=property,
):
yield error
elif subschema.get("required", False):
error = ValidationError("%r is a required property" % property)
error._set(
validator="required",
validator_value=subschema["required"],
instance=instance,
schema=schema,
)
error.path.appendleft(property)
error.schema_path.extend([property, "required"])
yield error
def disallow_draft3(validator, disallow, instance, schema):
for disallowed in _utils.ensure_list(disallow):
if validator.is_valid(instance, {"type" : [disallowed]}):
yield ValidationError(
"%r is disallowed for %r" % (disallowed, instance)
)
def extends_draft3(validator, extends, instance, schema):
if validator.is_type(extends, "object"):
for error in validator.descend(instance, extends):
yield error
return
for index, subschema in enumerate(extends):
for error in validator.descend(instance, subschema, schema_path=index):
yield error
def type_draft4(validator, types, instance, schema):
types = _utils.ensure_list(types)
if not any(validator.is_type(instance, type) for type in types):
yield ValidationError(_utils.types_msg(instance, types))
def properties_draft4(validator, properties, instance, schema):
if not validator.is_type(instance, "object"):
return
for property, subschema in iteritems(properties):
if property in instance:
for error in validator.descend(
instance[property],
subschema,
path=property,
schema_path=property,
):
yield error
def required_draft4(validator, required, instance, schema):
if not validator.is_type(instance, "object"):
return
for property in required:
if property not in instance:
yield ValidationError("%r is a required property" % property)
def minProperties_draft4(validator, mP, instance, schema):
if validator.is_type(instance, "object") and len(instance) < mP:
yield ValidationError(
"%r does not have enough properties" % (instance,)
)
def maxProperties_draft4(validator, mP, instance, schema):
if not validator.is_type(instance, "object"):
return
if validator.is_type(instance, "object") and len(instance) > mP:
yield ValidationError("%r has too many properties" % (instance,))
def allOf_draft4(validator, allOf, instance, schema):
for index, subschema in enumerate(allOf):
for error in validator.descend(instance, subschema, schema_path=index):
yield error
def oneOf_draft4(validator, oneOf, instance, schema):
subschemas = enumerate(oneOf)
all_errors = []
for index, subschema in subschemas:
errs = list(validator.descend(instance, subschema, schema_path=index))
if not errs:
first_valid = subschema
break
all_errors.extend(errs)
else:
yield ValidationError(
"%r is not valid under any of the given schemas" % (instance,),
context=all_errors,
)
more_valid = [s for i, s in subschemas if validator.is_valid(instance, s)]
if more_valid:
more_valid.append(first_valid)
reprs = ", ".join(repr(schema) for schema in more_valid)
yield ValidationError(
"%r is valid under each of %s" % (instance, reprs)
)
def anyOf_draft4(validator, anyOf, instance, schema):
all_errors = []
for index, subschema in enumerate(anyOf):
errs = list(validator.descend(instance, subschema, schema_path=index))
if not errs:
break
all_errors.extend(errs)
else:
yield ValidationError(
"%r is not valid under any of the given schemas" % (instance,),
context=all_errors,
)
def not_draft4(validator, not_schema, instance, schema):
if validator.is_valid(instance, not_schema):
yield ValidationError(
"%r is not allowed for %r" % (not_schema, instance)
)

@ -1,51 +0,0 @@
from __future__ import unicode_literals
import sys
import operator
try:
from collections import MutableMapping, Sequence # noqa
except ImportError:
from collections.abc import MutableMapping, Sequence # noqa
PY3 = sys.version_info[0] >= 3
if PY3:
zip = zip
from urllib.parse import (
unquote, urljoin, urlunsplit, SplitResult, urlsplit as _urlsplit
)
from urllib.request import urlopen
str_types = str,
int_types = int,
iteritems = operator.methodcaller("items")
else:
from itertools import izip as zip # noqa
from urlparse import (
urljoin, urlunsplit, SplitResult, urlsplit as _urlsplit # noqa
)
from urllib import unquote # noqa
from urllib2 import urlopen # noqa
str_types = basestring
int_types = int, long
iteritems = operator.methodcaller("iteritems")
# On python < 3.3 fragments are not handled properly with unknown schemes
def urlsplit(url):
scheme, netloc, path, query, fragment = _urlsplit(url)
if "#" in path:
path, fragment = path.split("#", 1)
return SplitResult(scheme, netloc, path, query, fragment)
def urldefrag(url):
if "#" in url:
s, n, p, q, frag = urlsplit(url)
defrag = urlunsplit((s, n, p, q, ''))
else:
defrag = url
frag = ''
return defrag, frag
# flake8: noqa

@ -1,263 +0,0 @@
from collections import defaultdict, deque
import itertools
import pprint
import textwrap
from jsonschema import _utils
from jsonschema.compat import PY3, iteritems
WEAK_MATCHES = frozenset(["anyOf", "oneOf"])
STRONG_MATCHES = frozenset()
_unset = _utils.Unset()
class _Error(Exception):
def __init__(
self,
message,
validator=_unset,
path=(),
cause=None,
context=(),
validator_value=_unset,
instance=_unset,
schema=_unset,
schema_path=(),
parent=None,
):
self.message = message
self.path = self.relative_path = deque(path)
self.schema_path = self.relative_schema_path = deque(schema_path)
self.context = list(context)
self.cause = self.__cause__ = cause
self.validator = validator
self.validator_value = validator_value
self.instance = instance
self.schema = schema
self.parent = parent
for error in context:
error.parent = self
def __repr__(self):
return "<%s: %r>" % (self.__class__.__name__, self.message)
def __str__(self):
return unicode(self).encode("utf-8")
def __unicode__(self):
if _unset in (
self.validator, self.validator_value, self.instance, self.schema,
):
return self.message
pschema = pprint.pformat(self.schema, width=72)
pinstance = pprint.pformat(self.instance, width=72)
return self.message + textwrap.dedent("""
Failed validating %r in schema%s:
%s
On instance%s:
%s
""".rstrip()
) % (
self.validator,
_utils.format_as_index(list(self.relative_schema_path)[:-1]),
_utils.indent(pschema),
_utils.format_as_index(self.relative_path),
_utils.indent(pinstance),
)
if PY3:
__str__ = __unicode__
@classmethod
def create_from(cls, other):
return cls(**other._contents())
@property
def absolute_path(self):
parent = self.parent
if parent is None:
return self.relative_path
path = deque(self.relative_path)
path.extendleft(parent.absolute_path)
return path
@property
def absolute_schema_path(self):
parent = self.parent
if parent is None:
return self.relative_schema_path
path = deque(self.relative_schema_path)
path.extendleft(parent.absolute_schema_path)
return path
def _set(self, **kwargs):
for k, v in iteritems(kwargs):
if getattr(self, k) is _unset:
setattr(self, k, v)
def _contents(self):
attrs = (
"message", "cause", "context", "validator", "validator_value",
"path", "schema_path", "instance", "schema", "parent",
)
return dict((attr, getattr(self, attr)) for attr in attrs)
class ValidationError(_Error):
pass
class SchemaError(_Error):
pass
class RefResolutionError(Exception):
pass
class UnknownType(Exception):
def __init__(self, type, instance, schema):
self.type = type
self.instance = instance
self.schema = schema
def __str__(self):
return unicode(self).encode("utf-8")
def __unicode__(self):
pschema = pprint.pformat(self.schema, width=72)
pinstance = pprint.pformat(self.instance, width=72)
return textwrap.dedent("""
Unknown type %r for validator with schema:
%s
While checking instance:
%s
""".rstrip()
) % (self.type, _utils.indent(pschema), _utils.indent(pinstance))
if PY3:
__str__ = __unicode__
class FormatError(Exception):
def __init__(self, message, cause=None):
super(FormatError, self).__init__(message, cause)
self.message = message
self.cause = self.__cause__ = cause
def __str__(self):
return self.message.encode("utf-8")
def __unicode__(self):
return self.message
if PY3:
__str__ = __unicode__
class ErrorTree(object):
"""
ErrorTrees make it easier to check which validations failed.
"""
_instance = _unset
def __init__(self, errors=()):
self.errors = {}
self._contents = defaultdict(self.__class__)
for error in errors:
container = self
for element in error.path:
container = container[element]
container.errors[error.validator] = error
self._instance = error.instance
def __contains__(self, index):
"""
Check whether ``instance[index]`` has any errors.
"""
return index in self._contents
def __getitem__(self, index):
"""
Retrieve the child tree one level down at the given ``index``.
If the index is not in the instance that this tree corresponds to and
is not known by this tree, whatever error would be raised by
``instance.__getitem__`` will be propagated (usually this is some
subclass of :class:`LookupError`).
"""
if self._instance is not _unset and index not in self:
self._instance[index]
return self._contents[index]
def __setitem__(self, index, value):
self._contents[index] = value
def __iter__(self):
"""
Iterate (non-recursively) over the indices in the instance with errors.
"""
return iter(self._contents)
def __len__(self):
"""
Same as :attr:`total_errors`.
"""
return self.total_errors
def __repr__(self):
return "<%s (%s total errors)>" % (self.__class__.__name__, len(self))
@property
def total_errors(self):
"""
The total number of errors in the entire tree, including children.
"""
child_errors = sum(len(tree) for _, tree in iteritems(self._contents))
return len(self.errors) + child_errors
def by_relevance(weak=WEAK_MATCHES, strong=STRONG_MATCHES):
def relevance(error):
validator = error.validator
return -len(error.path), validator not in weak, validator in strong
return relevance
relevance = by_relevance()
def best_match(errors, key=relevance):
errors = iter(errors)
best = next(errors, None)
if best is None:
return
best = max(itertools.chain([best], errors), key=key)
while best.context:
best = min(best.context, key=key)
return best

@ -1,201 +0,0 @@
{
"$schema": "http://json-schema.org/draft-03/schema#",
"dependencies": {
"exclusiveMaximum": "maximum",
"exclusiveMinimum": "minimum"
},
"id": "http://json-schema.org/draft-03/schema#",
"properties": {
"$ref": {
"format": "uri",
"type": "string"
},
"$schema": {
"format": "uri",
"type": "string"
},
"additionalItems": {
"default": {},
"type": [
{
"$ref": "#"
},
"boolean"
]
},
"additionalProperties": {
"default": {},
"type": [
{
"$ref": "#"
},
"boolean"
]
},
"default": {
"type": "any"
},
"dependencies": {
"additionalProperties": {
"items": {
"type": "string"
},
"type": [
"string",
"array",
{
"$ref": "#"
}
]
},
"default": {},
"type": [
"string",
"array",
"object"
]
},
"description": {
"type": "string"
},
"disallow": {
"items": {
"type": [
"string",
{
"$ref": "#"
}
]
},
"type": [
"string",
"array"
],
"uniqueItems": true
},
"divisibleBy": {
"default": 1,
"exclusiveMinimum": true,
"minimum": 0,
"type": "number"
},
"enum": {
"minItems": 1,
"type": "array",
"uniqueItems": true
},
"exclusiveMaximum": {
"default": false,
"type": "boolean"
},
"exclusiveMinimum": {
"default": false,
"type": "boolean"
},
"extends": {
"default": {},
"items": {
"$ref": "#"
},
"type": [
{
"$ref": "#"
},
"array"
]
},
"format": {
"type": "string"
},
"id": {
"format": "uri",
"type": "string"
},
"items": {
"default": {},
"items": {
"$ref": "#"
},
"type": [
{
"$ref": "#"
},
"array"
]
},
"maxDecimal": {
"minimum": 0,
"type": "number"
},
"maxItems": {
"minimum": 0,
"type": "integer"
},
"maxLength": {
"type": "integer"
},
"maximum": {
"type": "number"
},
"minItems": {
"default": 0,
"minimum": 0,
"type": "integer"
},
"minLength": {
"default": 0,
"minimum": 0,
"type": "integer"
},
"minimum": {
"type": "number"
},
"pattern": {
"format": "regex",
"type": "string"
},
"patternProperties": {
"additionalProperties": {
"$ref": "#"
},
"default": {},
"type": "object"
},
"properties": {
"additionalProperties": {
"$ref": "#",
"type": "object"
},
"default": {},
"type": "object"
},
"required": {
"default": false,
"type": "boolean"
},
"title": {
"type": "string"
},
"type": {
"default": "any",
"items": {
"type": [
"string",
{
"$ref": "#"
}
]
},
"type": [
"string",
"array"
],
"uniqueItems": true
},
"uniqueItems": {
"default": false,
"type": "boolean"
}
},
"type": "object"
}

@ -1,221 +0,0 @@
{
"$schema": "http://json-schema.org/draft-04/schema#",
"default": {},
"definitions": {
"positiveInteger": {
"minimum": 0,
"type": "integer"
},
"positiveIntegerDefault0": {
"allOf": [
{
"$ref": "#/definitions/positiveInteger"
},
{
"default": 0
}
]
},
"schemaArray": {
"items": {
"$ref": "#"
},
"minItems": 1,
"type": "array"
},
"simpleTypes": {
"enum": [
"array",
"boolean",
"integer",
"null",
"number",
"object",
"string"
]
},
"stringArray": {
"items": {
"type": "string"
},
"minItems": 1,
"type": "array",
"uniqueItems": true
}
},
"dependencies": {
"exclusiveMaximum": [
"maximum"
],
"exclusiveMinimum": [
"minimum"
]
},
"description": "Core schema meta-schema",
"id": "http://json-schema.org/draft-04/schema#",
"properties": {
"$schema": {
"format": "uri",
"type": "string"
},
"additionalItems": {
"anyOf": [
{
"type": "boolean"
},
{
"$ref": "#"
}
],
"default": {}
},
"additionalProperties": {
"anyOf": [
{
"type": "boolean"
},
{
"$ref": "#"
}
],
"default": {}
},
"allOf": {
"$ref": "#/definitions/schemaArray"
},
"anyOf": {
"$ref": "#/definitions/schemaArray"
},
"default": {},
"definitions": {
"additionalProperties": {
"$ref": "#"
},
"default": {},
"type": "object"
},
"dependencies": {
"additionalProperties": {
"anyOf": [
{
"$ref": "#"
},
{
"$ref": "#/definitions/stringArray"
}
]
},
"type": "object"
},
"description": {
"type": "string"
},
"enum": {
"minItems": 1,
"type": "array",
"uniqueItems": true
},
"exclusiveMaximum": {
"default": false,
"type": "boolean"
},
"exclusiveMinimum": {
"default": false,
"type": "boolean"
},
"id": {
"format": "uri",
"type": "string"
},
"items": {
"anyOf": [
{
"$ref": "#"
},
{
"$ref": "#/definitions/schemaArray"
}
],
"default": {}
},
"maxItems": {
"$ref": "#/definitions/positiveInteger"
},
"maxLength": {
"$ref": "#/definitions/positiveInteger"
},
"maxProperties": {
"$ref": "#/definitions/positiveInteger"
},
"maximum": {
"type": "number"
},
"minItems": {
"$ref": "#/definitions/positiveIntegerDefault0"
},
"minLength": {
"$ref": "#/definitions/positiveIntegerDefault0"
},
"minProperties": {
"$ref": "#/definitions/positiveIntegerDefault0"
},
"minimum": {
"type": "number"
},
"multipleOf": {
"exclusiveMinimum": true,
"minimum": 0,
"type": "number"
},
"not": {
"$ref": "#"
},
"oneOf": {
"$ref": "#/definitions/schemaArray"
},
"pattern": {
"format": "regex",
"type": "string"
},
"patternProperties": {
"additionalProperties": {
"$ref": "#"
},
"default": {},
"type": "object"
},
"properties": {
"additionalProperties": {
"$ref": "#"
},
"default": {},
"type": "object"
},
"required": {
"$ref": "#/definitions/stringArray"
},
"title": {
"type": "string"
},
"type": {
"anyOf": [
{
"$ref": "#/definitions/simpleTypes"
},
{
"items": {
"$ref": "#/definitions/simpleTypes"
},
"minItems": 1,
"type": "array",
"uniqueItems": true
}
]
},
"uniqueItems": {
"default": false,
"type": "boolean"
}
},
"type": "object"
}

@ -1,15 +0,0 @@
import sys
if sys.version_info[:2] < (2, 7): # pragma: no cover
import unittest2 as unittest
else:
import unittest
try:
from unittest import mock
except ImportError:
import mock
# flake8: noqa

@ -1,270 +0,0 @@
from jsonschema import Draft4Validator, exceptions
from jsonschema.tests.compat import mock, unittest
class TestBestMatch(unittest.TestCase):
def best_match(self, errors):
errors = list(errors)
best = exceptions.best_match(errors)
reversed_best = exceptions.best_match(reversed(errors))
self.assertEqual(
best,
reversed_best,
msg="Didn't return a consistent best match!\n"
"Got: {0}\n\nThen: {1}".format(best, reversed_best),
)
return best
def test_shallower_errors_are_better_matches(self):
validator = Draft4Validator(
{
"properties" : {
"foo" : {
"minProperties" : 2,
"properties" : {"bar" : {"type" : "object"}},
}
}
}
)
best = self.best_match(validator.iter_errors({"foo" : {"bar" : []}}))
self.assertEqual(best.validator, "minProperties")
def test_oneOf_and_anyOf_are_weak_matches(self):
"""
A property you *must* match is probably better than one you have to
match a part of.
"""
validator = Draft4Validator(
{
"minProperties" : 2,
"anyOf" : [{"type" : "string"}, {"type" : "number"}],
"oneOf" : [{"type" : "string"}, {"type" : "number"}],
}
)
best = self.best_match(validator.iter_errors({}))
self.assertEqual(best.validator, "minProperties")
def test_if_the_most_relevant_error_is_anyOf_it_is_traversed(self):
"""
If the most relevant error is an anyOf, then we traverse its context
and select the otherwise *least* relevant error, since in this case
that means the most specific, deep, error inside the instance.
I.e. since only one of the schemas must match, we look for the most
relevant one.
"""
validator = Draft4Validator(
{
"properties" : {
"foo" : {
"anyOf" : [
{"type" : "string"},
{"properties" : {"bar" : {"type" : "array"}}},
],
},
},
},
)
best = self.best_match(validator.iter_errors({"foo" : {"bar" : 12}}))
self.assertEqual(best.validator_value, "array")
def test_if_the_most_relevant_error_is_oneOf_it_is_traversed(self):
"""
If the most relevant error is an oneOf, then we traverse its context
and select the otherwise *least* relevant error, since in this case
that means the most specific, deep, error inside the instance.
I.e. since only one of the schemas must match, we look for the most
relevant one.
"""
validator = Draft4Validator(
{
"properties" : {
"foo" : {
"oneOf" : [
{"type" : "string"},
{"properties" : {"bar" : {"type" : "array"}}},
],
},
},
},
)
best = self.best_match(validator.iter_errors({"foo" : {"bar" : 12}}))
self.assertEqual(best.validator_value, "array")
def test_if_the_most_relevant_error_is_allOf_it_is_traversed(self):
"""
Now, if the error is allOf, we traverse but select the *most* relevant
error from the context, because all schemas here must match anyways.
"""
validator = Draft4Validator(
{
"properties" : {
"foo" : {
"allOf" : [
{"type" : "string"},
{"properties" : {"bar" : {"type" : "array"}}},
],
},
},
},
)
best = self.best_match(validator.iter_errors({"foo" : {"bar" : 12}}))
self.assertEqual(best.validator_value, "string")
def test_nested_context_for_oneOf(self):
validator = Draft4Validator(
{
"properties" : {
"foo" : {
"oneOf" : [
{"type" : "string"},
{
"oneOf" : [
{"type" : "string"},
{
"properties" : {
"bar" : {"type" : "array"}
},
},
],
},
],
},
},
},
)
best = self.best_match(validator.iter_errors({"foo" : {"bar" : 12}}))
self.assertEqual(best.validator_value, "array")
def test_one_error(self):
validator = Draft4Validator({"minProperties" : 2})
error, = validator.iter_errors({})
self.assertEqual(
exceptions.best_match(validator.iter_errors({})).validator,
"minProperties",
)
def test_no_errors(self):
validator = Draft4Validator({})
self.assertIsNone(exceptions.best_match(validator.iter_errors({})))
class TestByRelevance(unittest.TestCase):
def test_short_paths_are_better_matches(self):
shallow = exceptions.ValidationError("Oh no!", path=["baz"])
deep = exceptions.ValidationError("Oh yes!", path=["foo", "bar"])
match = max([shallow, deep], key=exceptions.relevance)
self.assertIs(match, shallow)
match = max([deep, shallow], key=exceptions.relevance)
self.assertIs(match, shallow)
def test_global_errors_are_even_better_matches(self):
shallow = exceptions.ValidationError("Oh no!", path=[])
deep = exceptions.ValidationError("Oh yes!", path=["foo"])
errors = sorted([shallow, deep], key=exceptions.relevance)
self.assertEqual(
[list(error.path) for error in errors],
[["foo"], []],
)
errors = sorted([deep, shallow], key=exceptions.relevance)
self.assertEqual(
[list(error.path) for error in errors],
[["foo"], []],
)
def test_weak_validators_are_lower_priority(self):
weak = exceptions.ValidationError("Oh no!", path=[], validator="a")
normal = exceptions.ValidationError("Oh yes!", path=[], validator="b")
best_match = exceptions.by_relevance(weak="a")
match = max([weak, normal], key=best_match)
self.assertIs(match, normal)
match = max([normal, weak], key=best_match)
self.assertIs(match, normal)
def test_strong_validators_are_higher_priority(self):
weak = exceptions.ValidationError("Oh no!", path=[], validator="a")
normal = exceptions.ValidationError("Oh yes!", path=[], validator="b")
strong = exceptions.ValidationError("Oh fine!", path=[], validator="c")
best_match = exceptions.by_relevance(weak="a", strong="c")
match = max([weak, normal, strong], key=best_match)
self.assertIs(match, strong)
match = max([strong, normal, weak], key=best_match)
self.assertIs(match, strong)
class TestErrorTree(unittest.TestCase):
def test_it_knows_how_many_total_errors_it_contains(self):
errors = [mock.MagicMock() for _ in range(8)]
tree = exceptions.ErrorTree(errors)
self.assertEqual(tree.total_errors, 8)
def test_it_contains_an_item_if_the_item_had_an_error(self):
errors = [exceptions.ValidationError("a message", path=["bar"])]
tree = exceptions.ErrorTree(errors)
self.assertIn("bar", tree)
def test_it_does_not_contain_an_item_if_the_item_had_no_error(self):
errors = [exceptions.ValidationError("a message", path=["bar"])]
tree = exceptions.ErrorTree(errors)
self.assertNotIn("foo", tree)
def test_validators_that_failed_appear_in_errors_dict(self):
error = exceptions.ValidationError("a message", validator="foo")
tree = exceptions.ErrorTree([error])
self.assertEqual(tree.errors, {"foo" : error})
def test_it_creates_a_child_tree_for_each_nested_path(self):
errors = [
exceptions.ValidationError("a bar message", path=["bar"]),
exceptions.ValidationError("a bar -> 0 message", path=["bar", 0]),
]
tree = exceptions.ErrorTree(errors)
self.assertIn(0, tree["bar"])
self.assertNotIn(1, tree["bar"])
def test_children_have_their_errors_dicts_built(self):
e1, e2 = (
exceptions.ValidationError("1", validator="foo", path=["bar", 0]),
exceptions.ValidationError("2", validator="quux", path=["bar", 0]),
)
tree = exceptions.ErrorTree([e1, e2])
self.assertEqual(tree["bar"][0].errors, {"foo" : e1, "quux" : e2})
def test_it_does_not_contain_subtrees_that_are_not_in_the_instance(self):
error = exceptions.ValidationError("123", validator="foo", instance=[])
tree = exceptions.ErrorTree([error])
with self.assertRaises(IndexError):
tree[0]
def test_if_its_in_the_tree_anyhow_it_does_not_raise_an_error(self):
"""
If a validator is dumb (like :validator:`required` in draft 3) and
refers to a path that isn't in the instance, the tree still properly
returns a subtree for that path.
"""
error = exceptions.ValidationError(
"a message", validator="foo", instance={}, path=["foo"],
)
tree = exceptions.ErrorTree([error])
self.assertIsInstance(tree["foo"], exceptions.ErrorTree)

@ -1,63 +0,0 @@
"""
Tests for the parts of jsonschema related to the :validator:`format` property.
"""
from jsonschema.tests.compat import mock, unittest
from jsonschema import FormatError, ValidationError, FormatChecker
from jsonschema.validators import Draft4Validator
class TestFormatChecker(unittest.TestCase):
def setUp(self):
self.fn = mock.Mock()
def test_it_can_validate_no_formats(self):
checker = FormatChecker(formats=())
self.assertFalse(checker.checkers)
def test_it_raises_a_key_error_for_unknown_formats(self):
with self.assertRaises(KeyError):
FormatChecker(formats=["o noes"])
def test_it_can_register_cls_checkers(self):
with mock.patch.dict(FormatChecker.checkers, clear=True):
FormatChecker.cls_checks("new")(self.fn)
self.assertEqual(FormatChecker.checkers, {"new" : (self.fn, ())})
def test_it_can_register_checkers(self):
checker = FormatChecker()
checker.checks("new")(self.fn)
self.assertEqual(
checker.checkers,
dict(FormatChecker.checkers, new=(self.fn, ()))
)
def test_it_catches_registered_errors(self):
checker = FormatChecker()
cause = self.fn.side_effect = ValueError()
checker.checks("foo", raises=ValueError)(self.fn)
with self.assertRaises(FormatError) as cm:
checker.check("bar", "foo")
self.assertIs(cm.exception.cause, cause)
self.assertIs(cm.exception.__cause__, cause)
# Unregistered errors should not be caught
self.fn.side_effect = AttributeError
with self.assertRaises(AttributeError):
checker.check("bar", "foo")
def test_format_error_causes_become_validation_error_causes(self):
checker = FormatChecker()
checker.checks("foo", raises=ValueError)(self.fn)
cause = self.fn.side_effect = ValueError()
validator = Draft4Validator({"format" : "foo"}, format_checker=checker)
with self.assertRaises(ValidationError) as cm:
validator.validate("bar")
self.assertIs(cm.exception.__cause__, cause)

@ -1,272 +0,0 @@
"""
Test runner for the JSON Schema official test suite
Tests comprehensive correctness of each draft's validator.
See https://github.com/json-schema/JSON-Schema-Test-Suite for details.
"""
from contextlib import closing
from decimal import Decimal
import glob
import json
import io
import itertools
import os
import re
import subprocess
try:
from sys import pypy_version_info
except ImportError:
pypy_version_info = None
from jsonschema import (
FormatError, SchemaError, ValidationError, Draft3Validator,
Draft4Validator, FormatChecker, draft3_format_checker,
draft4_format_checker, validate,
)
from jsonschema.compat import PY3
from jsonschema.tests.compat import mock, unittest
import jsonschema
REPO_ROOT = os.path.join(os.path.dirname(jsonschema.__file__), os.path.pardir)
SUITE = os.getenv("JSON_SCHEMA_TEST_SUITE", os.path.join(REPO_ROOT, "json"))
if not os.path.isdir(SUITE):
raise ValueError(
"Can't find the JSON-Schema-Test-Suite directory. Set the "
"'JSON_SCHEMA_TEST_SUITE' environment variable or run the tests from "
"alongside a checkout of the suite."
)
TESTS_DIR = os.path.join(SUITE, "tests")
JSONSCHEMA_SUITE = os.path.join(SUITE, "bin", "jsonschema_suite")
remotes_stdout = subprocess.Popen(
["python", JSONSCHEMA_SUITE, "remotes"], stdout=subprocess.PIPE,
).stdout
with closing(remotes_stdout):
if PY3:
remotes_stdout = io.TextIOWrapper(remotes_stdout)
REMOTES = json.load(remotes_stdout)
def make_case(schema, data, valid, name):
if valid:
def test_case(self):
kwargs = getattr(self, "validator_kwargs", {})
validate(data, schema, cls=self.validator_class, **kwargs)
else:
def test_case(self):
kwargs = getattr(self, "validator_kwargs", {})
with self.assertRaises(ValidationError):
validate(data, schema, cls=self.validator_class, **kwargs)
if not PY3:
name = name.encode("utf-8")
test_case.__name__ = name
return test_case
def maybe_skip(skip, test, case):
if skip is not None:
reason = skip(case)
if reason is not None:
test = unittest.skip(reason)(test)
return test
def load_json_cases(tests_glob, ignore_glob="", basedir=TESTS_DIR, skip=None):
if ignore_glob:
ignore_glob = os.path.join(basedir, ignore_glob)
def add_test_methods(test_class):
ignored = set(glob.iglob(ignore_glob))
for filename in glob.iglob(os.path.join(basedir, tests_glob)):
if filename in ignored:
continue
validating, _ = os.path.splitext(os.path.basename(filename))
id = itertools.count(1)
with open(filename) as test_file:
for case in json.load(test_file):
for test in case["tests"]:
name = "test_%s_%s_%s" % (
validating,
next(id),
re.sub(r"[\W ]+", "_", test["description"]),
)
assert not hasattr(test_class, name), name
test_case = make_case(
data=test["data"],
schema=case["schema"],
valid=test["valid"],
name=name,
)
test_case = maybe_skip(skip, test_case, case)
setattr(test_class, name, test_case)
return test_class
return add_test_methods
class TypesMixin(object):
@unittest.skipIf(PY3, "In Python 3 json.load always produces unicode")
def test_string_a_bytestring_is_a_string(self):
self.validator_class({"type" : "string"}).validate(b"foo")
class DecimalMixin(object):
def test_it_can_validate_with_decimals(self):
schema = {"type" : "number"}
validator = self.validator_class(
schema, types={"number" : (int, float, Decimal)}
)
for valid in [1, 1.1, Decimal(1) / Decimal(8)]:
validator.validate(valid)
for invalid in ["foo", {}, [], True, None]:
with self.assertRaises(ValidationError):
validator.validate(invalid)
def missing_format(checker):
def missing_format(case):
format = case["schema"].get("format")
if format not in checker.checkers:
return "Format checker {0!r} not found.".format(format)
elif (
format == "date-time" and
pypy_version_info is not None and
pypy_version_info[:2] <= (1, 9)
):
# datetime.datetime is overzealous about typechecking in <=1.9
return "datetime.datetime is broken on this version of PyPy."
return missing_format
class FormatMixin(object):
def test_it_returns_true_for_formats_it_does_not_know_about(self):
validator = self.validator_class(
{"format" : "carrot"}, format_checker=FormatChecker(),
)
validator.validate("bugs")
def test_it_does_not_validate_formats_by_default(self):
validator = self.validator_class({})
self.assertIsNone(validator.format_checker)
def test_it_validates_formats_if_a_checker_is_provided(self):
checker = mock.Mock(spec=FormatChecker)
validator = self.validator_class(
{"format" : "foo"}, format_checker=checker,
)
validator.validate("bar")
checker.check.assert_called_once_with("bar", "foo")
cause = ValueError()
checker.check.side_effect = FormatError('aoeu', cause=cause)
with self.assertRaises(ValidationError) as cm:
validator.validate("bar")
# Make sure original cause is attached
self.assertIs(cm.exception.cause, cause)
def test_it_validates_formats_of_any_type(self):
checker = mock.Mock(spec=FormatChecker)
validator = self.validator_class(
{"format" : "foo"}, format_checker=checker,
)
validator.validate([1, 2, 3])
checker.check.assert_called_once_with([1, 2, 3], "foo")
cause = ValueError()
checker.check.side_effect = FormatError('aoeu', cause=cause)
with self.assertRaises(ValidationError) as cm:
validator.validate([1, 2, 3])
# Make sure original cause is attached
self.assertIs(cm.exception.cause, cause)
@load_json_cases("draft3/*.json", ignore_glob="draft3/refRemote.json")
@load_json_cases(
"draft3/optional/format.json", skip=missing_format(draft3_format_checker)
)
@load_json_cases("draft3/optional/bignum.json")
@load_json_cases("draft3/optional/zeroTerminatedFloats.json")
class TestDraft3(unittest.TestCase, TypesMixin, DecimalMixin, FormatMixin):
validator_class = Draft3Validator
validator_kwargs = {"format_checker" : draft3_format_checker}
def test_any_type_is_valid_for_type_any(self):
validator = self.validator_class({"type" : "any"})
validator.validate(mock.Mock())
# TODO: we're in need of more meta schema tests
def test_invalid_properties(self):
with self.assertRaises(SchemaError):
validate({}, {"properties": {"test": True}},
cls=self.validator_class)
def test_minItems_invalid_string(self):
with self.assertRaises(SchemaError):
# needs to be an integer
validate([1], {"minItems" : "1"}, cls=self.validator_class)
@load_json_cases("draft4/*.json", ignore_glob="draft4/refRemote.json")
@load_json_cases(
"draft4/optional/format.json", skip=missing_format(draft4_format_checker)
)
@load_json_cases("draft4/optional/bignum.json")
@load_json_cases("draft4/optional/zeroTerminatedFloats.json")
class TestDraft4(unittest.TestCase, TypesMixin, DecimalMixin, FormatMixin):
validator_class = Draft4Validator
validator_kwargs = {"format_checker" : draft4_format_checker}
# TODO: we're in need of more meta schema tests
def test_invalid_properties(self):
with self.assertRaises(SchemaError):
validate({}, {"properties": {"test": True}},
cls=self.validator_class)
def test_minItems_invalid_string(self):
with self.assertRaises(SchemaError):
# needs to be an integer
validate([1], {"minItems" : "1"}, cls=self.validator_class)
class RemoteRefResolutionMixin(object):
def setUp(self):
patch = mock.patch("jsonschema.validators.requests")
requests = patch.start()
requests.get.side_effect = self.resolve
self.addCleanup(patch.stop)
def resolve(self, reference):
_, _, reference = reference.partition("http://localhost:1234/")
return mock.Mock(**{"json.return_value" : REMOTES.get(reference)})
@load_json_cases("draft3/refRemote.json")
class Draft3RemoteResolution(RemoteRefResolutionMixin, unittest.TestCase):
validator_class = Draft3Validator
@load_json_cases("draft4/refRemote.json")
class Draft4RemoteResolution(RemoteRefResolutionMixin, unittest.TestCase):
validator_class = Draft4Validator

@ -1,878 +0,0 @@
from collections import deque
from contextlib import contextmanager
import json
import textwrap
from jsonschema import FormatChecker, ValidationError
from jsonschema.compat import PY3
from jsonschema.tests.compat import mock, unittest
from jsonschema.validators import (
RefResolutionError, UnknownType, Draft3Validator,
Draft4Validator, RefResolver, create, extend, validator_for, validate,
)
class TestCreateAndExtend(unittest.TestCase):
def setUp(self):
self.meta_schema = {u"properties" : {u"smelly" : {}}}
self.smelly = mock.MagicMock()
self.validators = {u"smelly" : self.smelly}
self.types = {u"dict" : dict}
self.Validator = create(
meta_schema=self.meta_schema,
validators=self.validators,
default_types=self.types,
)
self.validator_value = 12
self.schema = {u"smelly" : self.validator_value}
self.validator = self.Validator(self.schema)
def test_attrs(self):
self.assertEqual(self.Validator.VALIDATORS, self.validators)
self.assertEqual(self.Validator.META_SCHEMA, self.meta_schema)
self.assertEqual(self.Validator.DEFAULT_TYPES, self.types)
def test_init(self):
self.assertEqual(self.validator.schema, self.schema)
def test_iter_errors(self):
instance = "hello"
self.smelly.return_value = []
self.assertEqual(list(self.validator.iter_errors(instance)), [])
error = mock.Mock()
self.smelly.return_value = [error]
self.assertEqual(list(self.validator.iter_errors(instance)), [error])
self.smelly.assert_called_with(
self.validator, self.validator_value, instance, self.schema,
)
def test_if_a_version_is_provided_it_is_registered(self):
with mock.patch("jsonschema.validators.validates") as validates:
validates.side_effect = lambda version : lambda cls : cls
Validator = create(meta_schema={u"id" : ""}, version="my version")
validates.assert_called_once_with("my version")
self.assertEqual(Validator.__name__, "MyVersionValidator")
def test_if_a_version_is_not_provided_it_is_not_registered(self):
with mock.patch("jsonschema.validators.validates") as validates:
create(meta_schema={u"id" : "id"})
self.assertFalse(validates.called)
def test_extend(self):
validators = dict(self.Validator.VALIDATORS)
new = mock.Mock()
Extended = extend(self.Validator, validators={u"a new one" : new})
validators.update([(u"a new one", new)])
self.assertEqual(Extended.VALIDATORS, validators)
self.assertNotIn(u"a new one", self.Validator.VALIDATORS)
self.assertEqual(Extended.META_SCHEMA, self.Validator.META_SCHEMA)
self.assertEqual(Extended.DEFAULT_TYPES, self.Validator.DEFAULT_TYPES)
class TestIterErrors(unittest.TestCase):
def setUp(self):
self.validator = Draft3Validator({})
def test_iter_errors(self):
instance = [1, 2]
schema = {
u"disallow" : u"array",
u"enum" : [["a", "b", "c"], ["d", "e", "f"]],
u"minItems" : 3
}
got = (e.message for e in self.validator.iter_errors(instance, schema))
expected = [
"%r is disallowed for [1, 2]" % (schema["disallow"],),
"[1, 2] is too short",
"[1, 2] is not one of %r" % (schema["enum"],),
]
self.assertEqual(sorted(got), sorted(expected))
def test_iter_errors_multiple_failures_one_validator(self):
instance = {"foo" : 2, "bar" : [1], "baz" : 15, "quux" : "spam"}
schema = {
u"properties" : {
"foo" : {u"type" : "string"},
"bar" : {u"minItems" : 2},
"baz" : {u"maximum" : 10, u"enum" : [2, 4, 6, 8]},
}
}
errors = list(self.validator.iter_errors(instance, schema))
self.assertEqual(len(errors), 4)
class TestValidationErrorMessages(unittest.TestCase):
def message_for(self, instance, schema, *args, **kwargs):
kwargs.setdefault("cls", Draft3Validator)
with self.assertRaises(ValidationError) as e:
validate(instance, schema, *args, **kwargs)
return e.exception.message
def test_single_type_failure(self):
message = self.message_for(instance=1, schema={u"type" : u"string"})
self.assertEqual(message, "1 is not of type %r" % u"string")
def test_single_type_list_failure(self):
message = self.message_for(instance=1, schema={u"type" : [u"string"]})
self.assertEqual(message, "1 is not of type %r" % u"string")
def test_multiple_type_failure(self):
types = u"string", u"object"
message = self.message_for(instance=1, schema={u"type" : list(types)})
self.assertEqual(message, "1 is not of type %r, %r" % types)
def test_object_without_title_type_failure(self):
type = {u"type" : [{u"minimum" : 3}]}
message = self.message_for(instance=1, schema={u"type" : [type]})
self.assertEqual(message, "1 is not of type %r" % (type,))
def test_object_with_name_type_failure(self):
name = "Foo"
schema = {u"type" : [{u"name" : name, u"minimum" : 3}]}
message = self.message_for(instance=1, schema=schema)
self.assertEqual(message, "1 is not of type %r" % (name,))
def test_minimum(self):
message = self.message_for(instance=1, schema={"minimum" : 2})
self.assertEqual(message, "1 is less than the minimum of 2")
def test_maximum(self):
message = self.message_for(instance=1, schema={"maximum" : 0})
self.assertEqual(message, "1 is greater than the maximum of 0")
def test_dependencies_failure_has_single_element_not_list(self):
depend, on = "bar", "foo"
schema = {u"dependencies" : {depend : on}}
message = self.message_for({"bar" : 2}, schema)
self.assertEqual(message, "%r is a dependency of %r" % (on, depend))
def test_additionalItems_single_failure(self):
message = self.message_for(
[2], {u"items" : [], u"additionalItems" : False},
)
self.assertIn("(2 was unexpected)", message)
def test_additionalItems_multiple_failures(self):
message = self.message_for(
[1, 2, 3], {u"items" : [], u"additionalItems" : False}
)
self.assertIn("(1, 2, 3 were unexpected)", message)
def test_additionalProperties_single_failure(self):
additional = "foo"
schema = {u"additionalProperties" : False}
message = self.message_for({additional : 2}, schema)
self.assertIn("(%r was unexpected)" % (additional,), message)
def test_additionalProperties_multiple_failures(self):
schema = {u"additionalProperties" : False}
message = self.message_for(dict.fromkeys(["foo", "bar"]), schema)
self.assertIn(repr("foo"), message)
self.assertIn(repr("bar"), message)
self.assertIn("were unexpected)", message)
def test_invalid_format_default_message(self):
checker = FormatChecker(formats=())
check_fn = mock.Mock(return_value=False)
checker.checks(u"thing")(check_fn)
schema = {u"format" : u"thing"}
message = self.message_for("bla", schema, format_checker=checker)
self.assertIn(repr("bla"), message)
self.assertIn(repr("thing"), message)
self.assertIn("is not a", message)
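The message_for helper above is a thin wrapper around catching ValidationError; a hedged sketch of the same idea outside the test suite:
from jsonschema import ValidationError, validate

try:
    validate(1, {u"type": u"string"})
except ValidationError as error:
    # For this schema the message reads roughly "1 is not of type 'string'".
    reason = error.message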
class TestErrorReprStr(unittest.TestCase):
def make_error(self, **kwargs):
defaults = dict(
message=u"hello",
validator=u"type",
validator_value=u"string",
instance=5,
schema={u"type": u"string"},
)
defaults.update(kwargs)
return ValidationError(**defaults)
def assertShows(self, expected, **kwargs):
if PY3:
expected = expected.replace("u'", "'")
expected = textwrap.dedent(expected).rstrip("\n")
error = self.make_error(**kwargs)
message_line, _, rest = str(error).partition("\n")
self.assertEqual(message_line, error.message)
self.assertEqual(rest, expected)
def test_repr(self):
self.assertEqual(
repr(ValidationError(message="Hello!")),
"<ValidationError: %r>" % "Hello!",
)
def test_unset_error(self):
error = ValidationError("message")
self.assertEqual(str(error), "message")
kwargs = {
"validator": "type",
"validator_value": "string",
"instance": 5,
"schema": {"type": "string"}
}
# Just the message should show if any of the attributes are unset
for attr in kwargs:
k = dict(kwargs)
del k[attr]
error = ValidationError("message", **k)
self.assertEqual(str(error), "message")
def test_empty_paths(self):
self.assertShows(
"""
Failed validating u'type' in schema:
{u'type': u'string'}
On instance:
5
""",
path=[],
schema_path=[],
)
def test_one_item_paths(self):
self.assertShows(
"""
Failed validating u'type' in schema:
{u'type': u'string'}
On instance[0]:
5
""",
path=[0],
schema_path=["items"],
)
def test_multiple_item_paths(self):
self.assertShows(
"""
Failed validating u'type' in schema[u'items'][0]:
{u'type': u'string'}
On instance[0][u'a']:
5
""",
path=[0, u"a"],
schema_path=[u"items", 0, 1],
)
def test_uses_pprint(self):
with mock.patch("pprint.pformat") as pformat:
str(self.make_error())
self.assertEqual(pformat.call_count, 2) # schema + instance
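assertShows above pins down the multi-line form of str(error); a quick sketch (editorial) of producing that output directly:
from __future__ import print_function
from jsonschema import Draft3Validator

error = next(Draft3Validator({u"type": u"string"}).iter_errors(5))
# The first line is error.message; the rest pretty-prints the failing schema
# ("Failed validating ...") and the instance ("On instance:").
print(str(error))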
class TestValidationErrorDetails(unittest.TestCase):
# TODO: These really need unit tests for each individual validator, rather
# than just these higher level tests.
def test_anyOf(self):
instance = 5
schema = {
"anyOf": [
{"minimum": 20},
{"type": "string"}
]
}
validator = Draft4Validator(schema)
errors = list(validator.iter_errors(instance))
self.assertEqual(len(errors), 1)
e = errors[0]
self.assertEqual(e.validator, "anyOf")
self.assertEqual(e.validator_value, schema["anyOf"])
self.assertEqual(e.instance, instance)
self.assertEqual(e.schema, schema)
self.assertIsNone(e.parent)
self.assertEqual(e.path, deque([]))
self.assertEqual(e.relative_path, deque([]))
self.assertEqual(e.absolute_path, deque([]))
self.assertEqual(e.schema_path, deque(["anyOf"]))
self.assertEqual(e.relative_schema_path, deque(["anyOf"]))
self.assertEqual(e.absolute_schema_path, deque(["anyOf"]))
self.assertEqual(len(e.context), 2)
e1, e2 = sorted_errors(e.context)
self.assertEqual(e1.validator, "minimum")
self.assertEqual(e1.validator_value, schema["anyOf"][0]["minimum"])
self.assertEqual(e1.instance, instance)
self.assertEqual(e1.schema, schema["anyOf"][0])
self.assertIs(e1.parent, e)
self.assertEqual(e1.path, deque([]))
self.assertEqual(e1.absolute_path, deque([]))
self.assertEqual(e1.relative_path, deque([]))
self.assertEqual(e1.schema_path, deque([0, "minimum"]))
self.assertEqual(e1.relative_schema_path, deque([0, "minimum"]))
self.assertEqual(
e1.absolute_schema_path, deque(["anyOf", 0, "minimum"]),
)
self.assertFalse(e1.context)
self.assertEqual(e2.validator, "type")
self.assertEqual(e2.validator_value, schema["anyOf"][1]["type"])
self.assertEqual(e2.instance, instance)
self.assertEqual(e2.schema, schema["anyOf"][1])
self.assertIs(e2.parent, e)
self.assertEqual(e2.path, deque([]))
self.assertEqual(e2.relative_path, deque([]))
self.assertEqual(e2.absolute_path, deque([]))
self.assertEqual(e2.schema_path, deque([1, "type"]))
self.assertEqual(e2.relative_schema_path, deque([1, "type"]))
self.assertEqual(e2.absolute_schema_path, deque(["anyOf", 1, "type"]))
self.assertEqual(len(e2.context), 0)
def test_type(self):
instance = {"foo": 1}
schema = {
"type": [
{"type": "integer"},
{
"type": "object",
"properties": {
"foo": {"enum": [2]}
}
}
]
}
validator = Draft3Validator(schema)
errors = list(validator.iter_errors(instance))
self.assertEqual(len(errors), 1)
e = errors[0]
self.assertEqual(e.validator, "type")
self.assertEqual(e.validator_value, schema["type"])
self.assertEqual(e.instance, instance)
self.assertEqual(e.schema, schema)
self.assertIsNone(e.parent)
self.assertEqual(e.path, deque([]))
self.assertEqual(e.relative_path, deque([]))
self.assertEqual(e.absolute_path, deque([]))
self.assertEqual(e.schema_path, deque(["type"]))
self.assertEqual(e.relative_schema_path, deque(["type"]))
self.assertEqual(e.absolute_schema_path, deque(["type"]))
self.assertEqual(len(e.context), 2)
e1, e2 = sorted_errors(e.context)
self.assertEqual(e1.validator, "type")
self.assertEqual(e1.validator_value, schema["type"][0]["type"])
self.assertEqual(e1.instance, instance)
self.assertEqual(e1.schema, schema["type"][0])
self.assertIs(e1.parent, e)
self.assertEqual(e1.path, deque([]))
self.assertEqual(e1.relative_path, deque([]))
self.assertEqual(e1.absolute_path, deque([]))
self.assertEqual(e1.schema_path, deque([0, "type"]))
self.assertEqual(e1.relative_schema_path, deque([0, "type"]))
self.assertEqual(e1.absolute_schema_path, deque(["type", 0, "type"]))
self.assertFalse(e1.context)
self.assertEqual(e2.validator, "enum")
self.assertEqual(e2.validator_value, [2])
self.assertEqual(e2.instance, 1)
self.assertEqual(e2.schema, {u"enum" : [2]})
self.assertIs(e2.parent, e)
self.assertEqual(e2.path, deque(["foo"]))
self.assertEqual(e2.relative_path, deque(["foo"]))
self.assertEqual(e2.absolute_path, deque(["foo"]))
self.assertEqual(
e2.schema_path, deque([1, "properties", "foo", "enum"]),
)
self.assertEqual(
e2.relative_schema_path, deque([1, "properties", "foo", "enum"]),
)
self.assertEqual(
e2.absolute_schema_path,
deque(["type", 1, "properties", "foo", "enum"]),
)
self.assertFalse(e2.context)
def test_single_nesting(self):
instance = {"foo" : 2, "bar" : [1], "baz" : 15, "quux" : "spam"}
schema = {
"properties" : {
"foo" : {"type" : "string"},
"bar" : {"minItems" : 2},
"baz" : {"maximum" : 10, "enum" : [2, 4, 6, 8]},
}
}
validator = Draft3Validator(schema)
errors = validator.iter_errors(instance)
e1, e2, e3, e4 = sorted_errors(errors)
self.assertEqual(e1.path, deque(["bar"]))
self.assertEqual(e2.path, deque(["baz"]))
self.assertEqual(e3.path, deque(["baz"]))
self.assertEqual(e4.path, deque(["foo"]))
self.assertEqual(e1.relative_path, deque(["bar"]))
self.assertEqual(e2.relative_path, deque(["baz"]))
self.assertEqual(e3.relative_path, deque(["baz"]))
self.assertEqual(e4.relative_path, deque(["foo"]))
self.assertEqual(e1.absolute_path, deque(["bar"]))
self.assertEqual(e2.absolute_path, deque(["baz"]))
self.assertEqual(e3.absolute_path, deque(["baz"]))
self.assertEqual(e4.absolute_path, deque(["foo"]))
self.assertEqual(e1.validator, "minItems")
self.assertEqual(e2.validator, "enum")
self.assertEqual(e3.validator, "maximum")
self.assertEqual(e4.validator, "type")
def test_multiple_nesting(self):
instance = [1, {"foo" : 2, "bar" : {"baz" : [1]}}, "quux"]
schema = {
"type" : "string",
"items" : {
"type" : ["string", "object"],
"properties" : {
"foo" : {"enum" : [1, 3]},
"bar" : {
"type" : "array",
"properties" : {
"bar" : {"required" : True},
"baz" : {"minItems" : 2},
}
}
}
}
}
validator = Draft3Validator(schema)
errors = validator.iter_errors(instance)
e1, e2, e3, e4, e5, e6 = sorted_errors(errors)
self.assertEqual(e1.path, deque([]))
self.assertEqual(e2.path, deque([0]))
self.assertEqual(e3.path, deque([1, "bar"]))
self.assertEqual(e4.path, deque([1, "bar", "bar"]))
self.assertEqual(e5.path, deque([1, "bar", "baz"]))
self.assertEqual(e6.path, deque([1, "foo"]))
self.assertEqual(e1.schema_path, deque(["type"]))
self.assertEqual(e2.schema_path, deque(["items", "type"]))
self.assertEqual(
list(e3.schema_path), ["items", "properties", "bar", "type"],
)
self.assertEqual(
list(e4.schema_path),
["items", "properties", "bar", "properties", "bar", "required"],
)
self.assertEqual(
list(e5.schema_path),
["items", "properties", "bar", "properties", "baz", "minItems"]
)
self.assertEqual(
list(e6.schema_path), ["items", "properties", "foo", "enum"],
)
self.assertEqual(e1.validator, "type")
self.assertEqual(e2.validator, "type")
self.assertEqual(e3.validator, "type")
self.assertEqual(e4.validator, "required")
self.assertEqual(e5.validator, "minItems")
self.assertEqual(e6.validator, "enum")
def test_additionalProperties(self):
instance = {"bar": "bar", "foo": 2}
schema = {
"additionalProperties" : {"type": "integer", "minimum": 5}
}
validator = Draft3Validator(schema)
errors = validator.iter_errors(instance)
e1, e2 = sorted_errors(errors)
self.assertEqual(e1.path, deque(["bar"]))
self.assertEqual(e2.path, deque(["foo"]))
self.assertEqual(e1.validator, "type")
self.assertEqual(e2.validator, "minimum")
def test_patternProperties(self):
instance = {"bar": 1, "foo": 2}
schema = {
"patternProperties" : {
"bar": {"type": "string"},
"foo": {"minimum": 5}
}
}
validator = Draft3Validator(schema)
errors = validator.iter_errors(instance)
e1, e2 = sorted_errors(errors)
self.assertEqual(e1.path, deque(["bar"]))
self.assertEqual(e2.path, deque(["foo"]))
self.assertEqual(e1.validator, "type")
self.assertEqual(e2.validator, "minimum")
def test_additionalItems(self):
instance = ["foo", 1]
schema = {
"items": [],
"additionalItems" : {"type": "integer", "minimum": 5}
}
validator = Draft3Validator(schema)
errors = validator.iter_errors(instance)
e1, e2 = sorted_errors(errors)
self.assertEqual(e1.path, deque([0]))
self.assertEqual(e2.path, deque([1]))
self.assertEqual(e1.validator, "type")
self.assertEqual(e2.validator, "minimum")
def test_additionalItems_with_items(self):
instance = ["foo", "bar", 1]
schema = {
"items": [{}],
"additionalItems" : {"type": "integer", "minimum": 5}
}
validator = Draft3Validator(schema)
errors = validator.iter_errors(instance)
e1, e2 = sorted_errors(errors)
self.assertEqual(e1.path, deque([1]))
self.assertEqual(e2.path, deque([2]))
self.assertEqual(e1.validator, "type")
self.assertEqual(e2.validator, "minimum")
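A compact sketch (editorial, with invented instances) of the two pieces of bookkeeping these tests pin down: sub-errors collected in error.context, and the instance/schema paths recorded on each error:
from jsonschema import Draft3Validator, Draft4Validator

# context: one sub-error per failed subschema of anyOf.
anyof_error = next(Draft4Validator(
    {"anyOf": [{"minimum": 20}, {"type": "string"}]}).iter_errors(5))
for suberror in anyof_error.context:
    crumbs = suberror.absolute_schema_path   # e.g. deque(['anyOf', 0, 'minimum'])

# path: which part of the instance each error refers to.
schema = {"additionalProperties": {"type": "integer", "minimum": 5}}
for error in Draft3Validator(schema).iter_errors({"bar": "bar", "foo": 2}):
    where, keyword = list(error.path), error.validator   # ['bar']/'type', ['foo']/'minimum'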
class ValidatorTestMixin(object):
def setUp(self):
self.instance = mock.Mock()
self.schema = {}
self.resolver = mock.Mock()
self.validator = self.validator_class(self.schema)
def test_valid_instances_are_valid(self):
errors = iter([])
with mock.patch.object(
self.validator, "iter_errors", return_value=errors,
):
self.assertTrue(
self.validator.is_valid(self.instance, self.schema)
)
def test_invalid_instances_are_not_valid(self):
errors = iter([mock.Mock()])
with mock.patch.object(
self.validator, "iter_errors", return_value=errors,
):
self.assertFalse(
self.validator.is_valid(self.instance, self.schema)
)
def test_non_existent_properties_are_ignored(self):
instance, my_property, my_value = mock.Mock(), mock.Mock(), mock.Mock()
validate(instance=instance, schema={my_property : my_value})
def test_it_creates_a_ref_resolver_if_not_provided(self):
self.assertIsInstance(self.validator.resolver, RefResolver)
def test_it_delegates_to_a_ref_resolver(self):
resolver = RefResolver("", {})
schema = {"$ref" : mock.Mock()}
@contextmanager
def resolving():
yield {"type": "integer"}
with mock.patch.object(resolver, "resolving") as resolve:
resolve.return_value = resolving()
with self.assertRaises(ValidationError):
self.validator_class(schema, resolver=resolver).validate(None)
resolve.assert_called_once_with(schema["$ref"])
def test_is_type_is_true_for_valid_type(self):
self.assertTrue(self.validator.is_type("foo", "string"))
def test_is_type_is_false_for_invalid_type(self):
self.assertFalse(self.validator.is_type("foo", "array"))
def test_is_type_evades_bool_inheriting_from_int(self):
self.assertFalse(self.validator.is_type(True, "integer"))
self.assertFalse(self.validator.is_type(True, "number"))
def test_is_type_raises_exception_for_unknown_type(self):
with self.assertRaises(UnknownType):
self.validator.is_type("foo", object())
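The bool checks above exist because bool subclasses int in Python; a short sketch of the resulting is_type behaviour:
from jsonschema import Draft4Validator

validator = Draft4Validator({})
assert not validator.is_type(True, "integer")   # bools are not treated as integers
assert validator.is_type(True, "boolean")       # but they are still booleans
assert validator.is_type("foo", "string")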
class TestDraft3Validator(ValidatorTestMixin, unittest.TestCase):
validator_class = Draft3Validator
def test_is_type_is_true_for_any_type(self):
self.assertTrue(self.validator.is_valid(mock.Mock(), {"type": "any"}))
def test_is_type_does_not_evade_bool_if_it_is_being_tested(self):
self.assertTrue(self.validator.is_type(True, "boolean"))
self.assertTrue(self.validator.is_valid(True, {"type": "any"}))
def test_non_string_custom_types(self):
schema = {'type': [None]}
cls = self.validator_class(schema, types={None: type(None)})
cls.validate(None, schema)
class TestDraft4Validator(ValidatorTestMixin, unittest.TestCase):
validator_class = Draft4Validator
class TestBuiltinFormats(unittest.TestCase):
"""
The built-in (specification-defined) formats do not raise type errors.
If an instance or value is not a string, it should be ignored.
"""
for format in FormatChecker.checkers:
def test(self, format=format):
v = Draft4Validator({"format": format}, format_checker=FormatChecker())
v.validate(123)
name = "test_{0}_ignores_non_strings".format(format)
test.__name__ = name
setattr(TestBuiltinFormats, name, test)
del test # Ugh py.test. Stop discovering top level tests.
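A sketch of the behaviour those generated tests assert, using the built-in email format as one concrete example (any name in FormatChecker.checkers would do):
from jsonschema import Draft4Validator, FormatChecker

validator = Draft4Validator({"format": "email"}, format_checker=FormatChecker())
validator.validate(123)          # non-strings are ignored by format, so this passes
assert not validator.is_valid("clearly-not-an-email")   # a failing string is reported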
class TestValidatorFor(unittest.TestCase):
def test_draft_3(self):
schema = {"$schema" : "http://json-schema.org/draft-03/schema"}
self.assertIs(validator_for(schema), Draft3Validator)
schema = {"$schema" : "http://json-schema.org/draft-03/schema#"}
self.assertIs(validator_for(schema), Draft3Validator)
def test_draft_4(self):
schema = {"$schema" : "http://json-schema.org/draft-04/schema"}
self.assertIs(validator_for(schema), Draft4Validator)
schema = {"$schema" : "http://json-schema.org/draft-04/schema#"}
self.assertIs(validator_for(schema), Draft4Validator)
def test_custom_validator(self):
Validator = create(meta_schema={"id" : "meta schema id"}, version="12")
schema = {"$schema" : "meta schema id"}
self.assertIs(validator_for(schema), Validator)
def test_validator_for_jsonschema_default(self):
self.assertIs(validator_for({}), Draft4Validator)
def test_validator_for_custom_default(self):
self.assertIs(validator_for({}, default=None), None)
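validator_for just looks the $schema value up among the registered meta schemas; a short sketch mirroring the tests above:
from jsonschema import Draft3Validator, Draft4Validator
from jsonschema.validators import validator_for

assert validator_for(
    {"$schema": "http://json-schema.org/draft-03/schema#"}) is Draft3Validator
assert validator_for({}) is Draft4Validator        # the current default
assert validator_for({}, default=None) is None     # unless a default is given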
class TestValidate(unittest.TestCase):
def test_draft3_validator_is_chosen(self):
schema = {"$schema" : "http://json-schema.org/draft-03/schema#"}
with mock.patch.object(Draft3Validator, "check_schema") as chk_schema:
validate({}, schema)
chk_schema.assert_called_once_with(schema)
# Make sure it works without the empty fragment
schema = {"$schema" : "http://json-schema.org/draft-03/schema"}
with mock.patch.object(Draft3Validator, "check_schema") as chk_schema:
validate({}, schema)
chk_schema.assert_called_once_with(schema)
def test_draft4_validator_is_chosen(self):
schema = {"$schema" : "http://json-schema.org/draft-04/schema#"}
with mock.patch.object(Draft4Validator, "check_schema") as chk_schema:
validate({}, schema)
chk_schema.assert_called_once_with(schema)
def test_draft4_validator_is_the_default(self):
with mock.patch.object(Draft4Validator, "check_schema") as chk_schema:
validate({}, {})
chk_schema.assert_called_once_with({})
class TestRefResolver(unittest.TestCase):
base_uri = ""
stored_uri = "foo://stored"
stored_schema = {"stored" : "schema"}
def setUp(self):
self.referrer = {}
self.store = {self.stored_uri : self.stored_schema}
self.resolver = RefResolver(self.base_uri, self.referrer, self.store)
def test_it_does_not_retrieve_schema_urls_from_the_network(self):
ref = Draft3Validator.META_SCHEMA["id"]
with mock.patch.object(self.resolver, "resolve_remote") as remote:
with self.resolver.resolving(ref) as resolved:
self.assertEqual(resolved, Draft3Validator.META_SCHEMA)
self.assertFalse(remote.called)
def test_it_resolves_local_refs(self):
ref = "#/properties/foo"
self.referrer["properties"] = {"foo" : object()}
with self.resolver.resolving(ref) as resolved:
self.assertEqual(resolved, self.referrer["properties"]["foo"])
def test_it_resolves_local_refs_with_id(self):
schema = {"id": "foo://bar/schema#", "a": {"foo": "bar"}}
resolver = RefResolver.from_schema(schema)
with resolver.resolving("#/a") as resolved:
self.assertEqual(resolved, schema["a"])
with resolver.resolving("foo://bar/schema#/a") as resolved:
self.assertEqual(resolved, schema["a"])
def test_it_retrieves_stored_refs(self):
with self.resolver.resolving(self.stored_uri) as resolved:
self.assertIs(resolved, self.stored_schema)
self.resolver.store["cached_ref"] = {"foo" : 12}
with self.resolver.resolving("cached_ref#/foo") as resolved:
self.assertEqual(resolved, 12)
def test_it_retrieves_unstored_refs_via_requests(self):
ref = "http://bar#baz"
schema = {"baz" : 12}
with mock.patch("jsonschema.validators.requests") as requests:
requests.get.return_value.json.return_value = schema
with self.resolver.resolving(ref) as resolved:
self.assertEqual(resolved, 12)
requests.get.assert_called_once_with("http://bar")
def test_it_retrieves_unstored_refs_via_urlopen(self):
ref = "http://bar#baz"
schema = {"baz" : 12}
with mock.patch("jsonschema.validators.requests", None):
with mock.patch("jsonschema.validators.urlopen") as urlopen:
urlopen.return_value.read.return_value = (
json.dumps(schema).encode("utf8"))
with self.resolver.resolving(ref) as resolved:
self.assertEqual(resolved, 12)
urlopen.assert_called_once_with("http://bar")
def test_it_can_construct_a_base_uri_from_a_schema(self):
schema = {"id" : "foo"}
resolver = RefResolver.from_schema(schema)
self.assertEqual(resolver.base_uri, "foo")
with resolver.resolving("") as resolved:
self.assertEqual(resolved, schema)
with resolver.resolving("#") as resolved:
self.assertEqual(resolved, schema)
with resolver.resolving("foo") as resolved:
self.assertEqual(resolved, schema)
with resolver.resolving("foo#") as resolved:
self.assertEqual(resolved, schema)
def test_it_can_construct_a_base_uri_from_a_schema_without_id(self):
schema = {}
resolver = RefResolver.from_schema(schema)
self.assertEqual(resolver.base_uri, "")
with resolver.resolving("") as resolved:
self.assertEqual(resolved, schema)
with resolver.resolving("#") as resolved:
self.assertEqual(resolved, schema)
def test_custom_uri_scheme_handlers(self):
schema = {"foo": "bar"}
ref = "foo://bar"
foo_handler = mock.Mock(return_value=schema)
resolver = RefResolver("", {}, handlers={"foo": foo_handler})
with resolver.resolving(ref) as resolved:
self.assertEqual(resolved, schema)
foo_handler.assert_called_once_with(ref)
def test_cache_remote_on(self):
ref = "foo://bar"
foo_handler = mock.Mock()
resolver = RefResolver(
"", {}, cache_remote=True, handlers={"foo" : foo_handler},
)
with resolver.resolving(ref):
pass
with resolver.resolving(ref):
pass
foo_handler.assert_called_once_with(ref)
def test_cache_remote_off(self):
ref = "foo://bar"
foo_handler = mock.Mock()
resolver = RefResolver(
"", {}, cache_remote=False, handlers={"foo" : foo_handler},
)
with resolver.resolving(ref):
pass
with resolver.resolving(ref):
pass
self.assertEqual(foo_handler.call_count, 2)
def test_if_you_give_it_junk_you_get_a_resolution_error(self):
ref = "foo://bar"
foo_handler = mock.Mock(side_effect=ValueError("Oh no! What's this?"))
resolver = RefResolver("", {}, handlers={"foo" : foo_handler})
with self.assertRaises(RefResolutionError) as err:
with resolver.resolving(ref):
pass
self.assertEqual(str(err.exception), "Oh no! What's this?")
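The handler and caching tests above boil down to usage along these lines (the foo:// scheme and the handler body are illustrative only):
from jsonschema import RefResolver

def fetch_foo(uri):
    # Stand-in retriever; a real handler would fetch and parse the document.
    return {"type": "integer"}

resolver = RefResolver("", {}, cache_remote=True, handlers={"foo": fetch_foo})
with resolver.resolving("foo://bar") as resolved:
    assert resolved == {"type": "integer"}
# With cache_remote=True, resolving the same URI again reuses the stored result.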
def sorted_errors(errors):
def key(error):
return (
[str(e) for e in error.path],
[str(e) for e in error.schema_path]
)
return sorted(errors, key=key)

@@ -1,428 +0,0 @@
from __future__ import division
import contextlib
import json
import numbers
try:
import requests
except ImportError:
requests = None
from jsonschema import _utils, _validators
from jsonschema.compat import (
Sequence, urljoin, urlsplit, urldefrag, unquote, urlopen,
str_types, int_types, iteritems,
)
from jsonschema.exceptions import ErrorTree # Backwards compatibility # noqa
from jsonschema.exceptions import RefResolutionError, SchemaError, UnknownType
_unset = _utils.Unset()
validators = {}
meta_schemas = _utils.URIDict()
def validates(version):
"""
Register the decorated validator for a ``version`` of the specification.
Registered validators and their meta schemas will be considered when
parsing ``$schema`` properties' URIs.
:argument str version: an identifier to use as the version's name
:returns: a class decorator to decorate the validator with the version
"""
def _validates(cls):
validators[version] = cls
if u"id" in cls.META_SCHEMA:
meta_schemas[cls.META_SCHEMA[u"id"]] = cls
return cls
return _validates
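A sketch (with a made-up meta schema id) of how a version registered through validates ends up being picked by validator_for, defined later in this module:
from jsonschema.validators import create, validator_for

MyValidator = create(meta_schema={u"id": u"tag:example,2014:my-meta"},
                     version="my version")
assert validator_for({u"$schema": u"tag:example,2014:my-meta"}) is MyValidator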
def create(meta_schema, validators=(), version=None, default_types=None): # noqa
if default_types is None:
default_types = {
u"array" : list, u"boolean" : bool, u"integer" : int_types,
u"null" : type(None), u"number" : numbers.Number, u"object" : dict,
u"string" : str_types,
}
class Validator(object):
VALIDATORS = dict(validators)
META_SCHEMA = dict(meta_schema)
DEFAULT_TYPES = dict(default_types)
def __init__(
self, schema, types=(), resolver=None, format_checker=None,
):
self._types = dict(self.DEFAULT_TYPES)
self._types.update(types)
if resolver is None:
resolver = RefResolver.from_schema(schema)
self.resolver = resolver
self.format_checker = format_checker
self.schema = schema
@classmethod
def check_schema(cls, schema):
for error in cls(cls.META_SCHEMA).iter_errors(schema):
raise SchemaError.create_from(error)
def iter_errors(self, instance, _schema=None):
if _schema is None:
_schema = self.schema
with self.resolver.in_scope(_schema.get(u"id", u"")):
ref = _schema.get(u"$ref")
if ref is not None:
validators = [(u"$ref", ref)]
else:
validators = iteritems(_schema)
for k, v in validators:
validator = self.VALIDATORS.get(k)
if validator is None:
continue
errors = validator(self, v, instance, _schema) or ()
for error in errors:
# set details if not already set by the called fn
error._set(
validator=k,
validator_value=v,
instance=instance,
schema=_schema,
)
if k != u"$ref":
error.schema_path.appendleft(k)
yield error
def descend(self, instance, schema, path=None, schema_path=None):
for error in self.iter_errors(instance, schema):
if path is not None:
error.path.appendleft(path)
if schema_path is not None:
error.schema_path.appendleft(schema_path)
yield error
def validate(self, *args, **kwargs):
for error in self.iter_errors(*args, **kwargs):
raise error
def is_type(self, instance, type):
if type not in self._types:
raise UnknownType(type, instance, self.schema)
pytypes = self._types[type]
# bool inherits from int, so ensure bools aren't reported as ints
if isinstance(instance, bool):
pytypes = _utils.flatten(pytypes)
is_number = any(
issubclass(pytype, numbers.Number) for pytype in pytypes
)
if is_number and bool not in pytypes:
return False
return isinstance(instance, pytypes)
def is_valid(self, instance, _schema=None):
error = next(self.iter_errors(instance, _schema), None)
return error is None
if version is not None:
Validator = validates(version)(Validator)
Validator.__name__ = version.title().replace(" ", "") + "Validator"
return Validator
def extend(validator, validators, version=None):
all_validators = dict(validator.VALIDATORS)
all_validators.update(validators)
return create(
meta_schema=validator.META_SCHEMA,
validators=all_validators,
version=version,
default_types=validator.DEFAULT_TYPES,
)
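A hedged sketch of plugging a custom keyword in through extend (the isEven keyword and its semantics are invented for illustration):
from jsonschema import Draft4Validator
from jsonschema.exceptions import ValidationError
from jsonschema.validators import extend

def even(validator, value, instance, schema):
    # Invoked like any entry in VALIDATORS; yield one error per failure.
    if value and validator.is_type(instance, "integer") and instance % 2:
        yield ValidationError("%r is not even" % (instance,))

EvenValidator = extend(Draft4Validator, {"isEven": even})
assert EvenValidator({"isEven": True}).is_valid(4)
assert not EvenValidator({"isEven": True}).is_valid(3)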
Draft3Validator = create(
meta_schema=_utils.load_schema("draft3"),
validators={
u"$ref" : _validators.ref,
u"additionalItems" : _validators.additionalItems,
u"additionalProperties" : _validators.additionalProperties,
u"dependencies" : _validators.dependencies,
u"disallow" : _validators.disallow_draft3,
u"divisibleBy" : _validators.multipleOf,
u"enum" : _validators.enum,
u"extends" : _validators.extends_draft3,
u"format" : _validators.format,
u"items" : _validators.items,
u"maxItems" : _validators.maxItems,
u"maxLength" : _validators.maxLength,
u"maximum" : _validators.maximum,
u"minItems" : _validators.minItems,
u"minLength" : _validators.minLength,
u"minimum" : _validators.minimum,
u"multipleOf" : _validators.multipleOf,
u"pattern" : _validators.pattern,
u"patternProperties" : _validators.patternProperties,
u"properties" : _validators.properties_draft3,
u"type" : _validators.type_draft3,
u"uniqueItems" : _validators.uniqueItems,
},
version="draft3",
)
Draft4Validator = create(
meta_schema=_utils.load_schema("draft4"),
validators={
u"$ref" : _validators.ref,
u"additionalItems" : _validators.additionalItems,
u"additionalProperties" : _validators.additionalProperties,
u"allOf" : _validators.allOf_draft4,
u"anyOf" : _validators.anyOf_draft4,
u"dependencies" : _validators.dependencies,
u"enum" : _validators.enum,
u"format" : _validators.format,
u"items" : _validators.items,
u"maxItems" : _validators.maxItems,
u"maxLength" : _validators.maxLength,
u"maxProperties" : _validators.maxProperties_draft4,
u"maximum" : _validators.maximum,
u"minItems" : _validators.minItems,
u"minLength" : _validators.minLength,
u"minProperties" : _validators.minProperties_draft4,
u"minimum" : _validators.minimum,
u"multipleOf" : _validators.multipleOf,
u"not" : _validators.not_draft4,
u"oneOf" : _validators.oneOf_draft4,
u"pattern" : _validators.pattern,
u"patternProperties" : _validators.patternProperties,
u"properties" : _validators.properties_draft4,
u"required" : _validators.required_draft4,
u"type" : _validators.type_draft4,
u"uniqueItems" : _validators.uniqueItems,
},
version="draft4",
)
class RefResolver(object):
"""
Resolve JSON References.
:argument str base_uri: URI of the referring document
:argument referrer: the actual referring document
:argument dict store: a mapping from URIs to documents to cache
:argument bool cache_remote: whether remote refs should be cached after
first resolution
:argument dict handlers: a mapping from URI schemes to functions that
should be used to retrieve them
"""
def __init__(
self, base_uri, referrer, store=(), cache_remote=True, handlers=(),
):
self.base_uri = base_uri
self.resolution_scope = base_uri
# This attribute is not used; it is kept for backwards compatibility

self.referrer = referrer
self.cache_remote = cache_remote
self.handlers = dict(handlers)
self.store = _utils.URIDict(
(id, validator.META_SCHEMA)
for id, validator in iteritems(meta_schemas)
)
self.store.update(store)
self.store[base_uri] = referrer
@classmethod
def from_schema(cls, schema, *args, **kwargs):
"""
Construct a resolver from a JSON schema object.
:argument schema schema: the referring schema
:rtype: :class:`RefResolver`
"""
return cls(schema.get(u"id", u""), schema, *args, **kwargs)
@contextlib.contextmanager
def in_scope(self, scope):
old_scope = self.resolution_scope
self.resolution_scope = urljoin(old_scope, scope)
try:
yield
finally:
self.resolution_scope = old_scope
@contextlib.contextmanager
def resolving(self, ref):
"""
Context manager which resolves a JSON ``ref`` and enters the
resolution scope of this ref.
:argument str ref: reference to resolve
"""
full_uri = urljoin(self.resolution_scope, ref)
uri, fragment = urldefrag(full_uri)
if not uri:
uri = self.base_uri
if uri in self.store:
document = self.store[uri]
else:
try:
document = self.resolve_remote(uri)
except Exception as exc:
raise RefResolutionError(exc)
old_base_uri, self.base_uri = self.base_uri, uri
try:
with self.in_scope(uri):
yield self.resolve_fragment(document, fragment)
finally:
self.base_uri = old_base_uri
def resolve_fragment(self, document, fragment):
"""
Resolve a ``fragment`` within the referenced ``document``.
:argument document: the referent document
:argument str fragment: a URI fragment to resolve within it
"""
fragment = fragment.lstrip(u"/")
parts = unquote(fragment).split(u"/") if fragment else []
for part in parts:
part = part.replace(u"~1", u"/").replace(u"~0", u"~")
if isinstance(document, Sequence):
# Array indexes should be turned into integers
try:
part = int(part)
except ValueError:
pass
try:
document = document[part]
except (TypeError, LookupError):
raise RefResolutionError(
"Unresolvable JSON pointer: %r" % fragment
)
return document
def resolve_remote(self, uri):
"""
Resolve a remote ``uri``.
Does not check the store first, but stores the retrieved document in
the store if :attr:`RefResolver.cache_remote` is True.
.. note::
If the requests_ library is present, ``jsonschema`` will use it to
request the remote ``uri``, so that the correct encoding is
detected and used.
If it isn't, or if the scheme of the ``uri`` is not ``http`` or
``https``, UTF-8 is assumed.
:argument str uri: the URI to resolve
:returns: the retrieved document
.. _requests: http://pypi.python.org/pypi/requests/
"""
scheme = urlsplit(uri).scheme
if scheme in self.handlers:
result = self.handlers[scheme](uri)
elif (
scheme in [u"http", u"https"] and
requests and
getattr(requests.Response, "json", None) is not None
):
# Requests has support for detecting the correct encoding of
# JSON over HTTP
if callable(requests.Response.json):
result = requests.get(uri).json()
else:
result = requests.get(uri).json
else:
# Otherwise, pass off to urllib and assume utf-8
result = json.loads(urlopen(uri).read().decode("utf-8"))
if self.cache_remote:
self.store[uri] = result
return result
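A quick sketch of the JSON Pointer handling in resolve_fragment above (the document and pointer are invented for illustration):
from jsonschema import RefResolver

resolver = RefResolver.from_schema({})
document = {"a/b": [{"c": 12}]}
# "~1" unescapes to "/" and "~0" to "~"; array indexes become integers.
assert resolver.resolve_fragment(document, u"a~1b/0/c") == 12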
def validator_for(schema, default=_unset):
if default is _unset:
default = Draft4Validator
return meta_schemas.get(schema.get(u"$schema", u""), default)
def validate(instance, schema, cls=None, *args, **kwargs):
"""
Validate an instance under the given schema.
>>> validate([2, 3, 4], {"maxItems" : 2})
Traceback (most recent call last):
...
ValidationError: [2, 3, 4] is too long
:func:`validate` will first verify that the provided schema is itself
valid, since not doing so can lead to less obvious error messages and to
validation failing in less obvious or consistent ways. If you know you
have a valid schema
already or don't care, you might prefer using the
:meth:`~IValidator.validate` method directly on a specific validator
(e.g. :meth:`Draft4Validator.validate`).
:argument instance: the instance to validate
:argument schema: the schema to validate with
:argument cls: an :class:`IValidator` class that will be used to validate
the instance.
If the ``cls`` argument is not provided, two things will happen in
accordance with the specification. First, if the schema has a
:validator:`$schema` property containing a known meta-schema [#]_ then the
proper validator will be used. The specification recommends that all
schemas contain :validator:`$schema` properties for this reason. If no
:validator:`$schema` property is found, the default validator class is
:class:`Draft4Validator`.
Any other provided positional and keyword arguments will be passed on when
instantiating the ``cls``.
:raises:
:exc:`ValidationError` if the instance is invalid
:exc:`SchemaError` if the schema itself is invalid
.. rubric:: Footnotes
.. [#] known by a validator registered with :func:`validates`
"""
if cls is None:
cls = validator_for(schema)
cls.check_schema(schema)
cls(schema, *args, **kwargs).validate(instance)
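A short sketch of the cls and extra-argument plumbing the validate docstring describes:
from jsonschema import Draft3Validator, FormatChecker, validate

# Pin the draft explicitly and pass a keyword argument through to the validator.
validate({"foo": 12},
         {"properties": {"foo": {"type": "integer"}}},
         cls=Draft3Validator,
         format_checker=FormatChecker())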

@@ -1,46 +0,0 @@
#! /usr/bin/env python
"""
A *very* basic performance test.
"""
from __future__ import print_function
import argparse
import textwrap
import timeit
IMPORT = "from jsonschema import Draft3Validator, Draft4Validator, validate\n"
parser = argparse.ArgumentParser()
parser.add_argument("-n", "--number", type=int, default=100)
arguments = parser.parse_args()
print("Validating {0} times.".format(arguments.number))
for name, benchmark in (
(
"Simple", """
validator = Draft3Validator(
{"type" : "object", "properties" : {"foo" : {"required" : True}}}
)
instance = {"foo" : 12, "bar" : 13}
"""
),
(
"Meta schema", """
validator = Draft3Validator(Draft3Validator.META_SCHEMA)
instance = validator.META_SCHEMA
"""
),
):
results = timeit.timeit(
number=arguments.number,
setup=IMPORT + textwrap.dedent(benchmark),
stmt="validator.validate(instance)",
)
print("{0:15}: {1} seconds".format(name, results))

@@ -1,2 +0,0 @@
[wheel]
universal = 1

@@ -1,39 +0,0 @@
try:
from setuptools import setup
except ImportError:
from distutils.core import setup
from jsonschema import __version__
with open("README.rst") as readme:
long_description = readme.read()
classifiers = [
"Development Status :: 5 - Production/Stable",
"Intended Audience :: Developers",
"License :: OSI Approved :: MIT License",
"Operating System :: OS Independent",
"Programming Language :: Python",
"Programming Language :: Python :: 2",
"Programming Language :: Python :: 2.6",
"Programming Language :: Python :: 2.7",
"Programming Language :: Python :: 3",
"Programming Language :: Python :: 3.3",
"Programming Language :: Python :: Implementation :: CPython",
"Programming Language :: Python :: Implementation :: PyPy",
]
setup(
name="jsonschema",
version=__version__,
packages=["jsonschema", "jsonschema.tests"],
package_data={"jsonschema": ["schemas/*.json"]},
author="Julian Berman",
author_email="Julian@GrayVines.com",
classifiers=classifiers,
description="An implementation of JSON Schema validation for Python",
license="MIT",
long_description=long_description,
url="http://github.com/Julian/jsonschema",
)

@@ -32,6 +32,17 @@
}
]
},
{
"description": "integer comparison",
"schema": {"maximum": 18446744073709551615},
"tests": [
{
"description": "comparison works for high numbers",
"data": 18446744073709551600,
"valid": true
}
]
},
{
"description": "float comparison with high precision",
"schema": {

@@ -32,6 +32,17 @@
}
]
},
{
"description": "integer comparison",
"schema": {"maximum": 18446744073709551615},
"tests": [
{
"description": "comparison works for high numbers",
"data": 18446744073709551600,
"valid": true
}
]
},
{
"description": "float comparison with high precision",
"schema": {

tox.ini
@@ -1,72 +0,0 @@
[tox]
envlist = py26, py27, pypy, py33, py34, docs, style
[testenv]
commands =
py.test [] -s jsonschema
{envpython} -m doctest README.rst
deps =
{[testenv:notpy33]deps}
{[testenv:py33]deps}
[testenv:coverage]
commands =
coverage run --branch --source jsonschema [] {envbindir}/py.test
coverage report --show-missing
coverage html
deps =
{[testenv:notpy33]deps}
{[testenv:py33]deps}
coverage
[testenv:docs]
basepython = python
changedir = docs
deps =
lxml
sphinx
commands =
sphinx-build [] -W -b html -d {envtmpdir}/doctrees . {envtmpdir}/html
[testenv:style]
deps = flake8
commands =
flake8 [] --max-complexity 10 jsonschema
[testenv:py26]
deps =
{[testenv:notpy33]deps}
{[testenv:all]deps}
argparse
unittest2
[testenv:py33]
commands =
py.test [] -s jsonschema
{envpython} -m doctest README.rst
sphinx-build -b doctest docs {envtmpdir}/html
deps =
{[testenv:all]deps}
{[testenv:notpy26]deps}
[testenv:notpy33]
deps =
mock
[testenv:notpy26]
deps =
rfc3987
[testenv:all]
deps =
lxml
pytest
sphinx
strict-rfc3339
webcolors
[flake8]
ignore = E203,E302,E303,E701,F811
[pytest]
addopts = -r s