Compare commits

..

92 Commits

Author SHA1 Message Date
Benjamin Dauvergne cd78af2623 base_adresse: add indexes on geographic models names (#66694)
2023-11-28 16:49:40 +01:00
Frédéric Péters b9939892b8 opengis: include data from namespaced elements (#79982)
2023-11-27 13:41:47 +01:00
Benjamin Dauvergne 7de7cd8b3f cmis: produce more precise APIError on cmislib.PermissionDeniedException (#83682)
2023-11-27 10:31:35 +01:00
Benjamin Dauvergne c247197c6e cmis: search first existing parent folder from the end of the path (#83682) 2023-11-27 10:31:35 +01:00
Benjamin Dauvergne bfd1fcc2f6 translation update
2023-11-22 21:25:20 +01:00
Nicolas Roche 905e3b141f toulouse-maelis: hide invoices no longer returned by maelis (#83676)
2023-11-22 21:02:16 +01:00
Nicolas Roche e8122d29eb toulouse-maelis: allow to link with RL2 data (#83842)
2023-11-22 19:03:53 +01:00
Serghei Mihai 4c5204bd2f mdel: cleanup demands folders (#83570)
2023-11-16 11:54:35 +01:00
Corentin Sechet 07619bc012 qrcode: fix fullscreen size on desktop & icons (#83517)
2023-11-15 13:19:18 +01:00
Corentin Sechet b516b7b66c qrcode: fix warnings on the reader page (#83525)
2023-11-15 13:19:09 +01:00
Corentin Sechet 92768f5852 qrcode: don't show error in console if camera torch isn't supported (#83516)
2023-11-15 13:18:58 +01:00
Corentin Sechet d69e4df328 toulouse-foederis: add degree level parameter to attach_degree endpoint (#83507)
2023-11-15 11:39:05 +01:00
Corentin Sechet 2711f5c615 toulouse-foederis: add count parameters when updating announces (#83507) 2023-11-15 11:39:05 +01:00
Benjamin Dauvergne cd08a2068c qrcode: make validity optional (#83464)
2023-11-14 20:29:46 +01:00
Benjamin Dauvergne 66e99362ef qrcode: factorize generation of b45 signed data (#83464) 2023-11-14 20:29:46 +01:00
Benjamin Dauvergne 00443f8629 qrcode: change lost signature key in JS tests (#83464) 2023-11-14 20:29:46 +01:00
Corentin Sechet 230d424571 isere-esrh: add job-types endpoint (#82880)
2023-11-14 16:36:56 +01:00
Corentin Sechet a8e2223c50 isere-esrh: add entities endpoint (#82880) 2023-11-14 16:36:56 +01:00
Corentin Sechet 7951510aa1 isere-esrh: add connector and official endpoint (#82880) 2023-11-14 16:36:56 +01:00
Thomas NOËL 08fa0fad21 soap: complete check status system (#83473)
2023-11-14 14:35:54 +01:00
Benjamin Dauvergne 320013ac68 qrcode: add creation and last modification datetime to data models (#83463)
2023-11-13 22:08:07 +01:00
Benjamin Dauvergne 4338ee9cd7 translation update
2023-11-13 18:42:38 +01:00
Corentin Sechet 32d3dd01bc qrcode: add frontend qrcode reader (#82652)
2023-11-13 16:57:06 +01:00
Corentin Sechet 7314fa224c js: add js unit tests support (#82651)
2023-11-13 16:19:51 +01:00
Corentin Sechet 82e9018865 qrcode: add qrcode reader management endpoints (#82650)
2023-11-13 15:31:35 +01:00
Corentin Sechet a51c49a865 qrcode: add get-qrcode endpoint (#82649)
2023-11-13 14:55:32 +01:00
Corentin Sechet 1e12dae71b qrcode: create qrcode connector & certificate management endpoints (#82648)
2023-11-13 14:34:31 +01:00
Thomas NOËL e506facfd6 debian: add back memory-report to uwsgi default configuration (#80451)
2023-11-13 11:34:46 +01:00
Thomas NOËL 0a28034137 astech: don't use bytes in APIError content (#83342)
2023-11-10 15:32:21 +01:00
Nicolas Roche 81f58cad59 toulouse-maelis: filter subscribable school years (#83262)
2023-11-10 12:11:28 +01:00
Emmanuel Cazenave 2a73e4dfb3 general: log Publik-Caller header (#83111)
2023-11-10 11:00:14 +01:00
Nicolas Roche 979e531b3a toulouse-maelis: correct for-payment parameter usage (#77110)
2023-11-10 10:41:34 +01:00
Nicolas Roche 5cd1e3aacc toulouse-maelis: get users linked to a family (#77295)
2023-11-10 10:38:06 +01:00
Nicolas Roche aa9585071a toulouse-maelis: manage online_payment without cache (#83257)
2023-11-10 10:37:35 +01:00
Nicolas Roche 923427783c toulouse-maelis: rename activity type label (#83165)
2023-11-10 10:12:43 +01:00
Nicolas Roche 34ac701200 toulouse-maelis: display default template values in manager (#83054)
2023-11-10 09:54:35 +01:00
Nicolas Roche 1b0c842d48 toulouse-maelis: manage applicative error on school registration (#82705)
2023-11-10 09:51:20 +01:00
Nicolas Roche d0f4b9ecf9 toulouse-maelis: [tests] add test on school registration error (#82705) 2023-11-10 09:51:20 +01:00
Nicolas Roche a7ff9bbc4a toulouse-maelis: put global catalog in cache (#82379)
2023-11-10 09:29:05 +01:00
Nicolas Roche b7b50717ca toulouse-maelis: disable connector by default (#83025)
2023-11-10 09:27:34 +01:00
Nicolas Roche 6c4fc4152d toulouse-maelis: remove unused ref_date on catalog endpoint (#82966)
2023-11-10 08:59:40 +01:00
Nicolas Roche e59765eaf7 toulouse-maelis: prevent creating invoices concurrently (#82706)
2023-11-10 08:43:17 +01:00
Thomas NOËL 8bb8f2c1df translation update
2023-11-03 15:33:33 +01:00
Thomas NOËL e2a45ea01b matrix42: rename search_filter to filter (#83112)
2023-11-03 15:06:41 +01:00
Thomas NOËL 11d3bd5a9b translation update
2023-11-03 13:39:27 +01:00
Thomas NOËL 2162e9d08d matrix42: use a pattern for ddname on fragment endpoint (#83105)
2023-11-03 13:12:47 +01:00
Thomas NOËL 264550e363 matrix42: add filter possibility on fragment endpoint (#83103)
2023-11-03 12:10:00 +01:00
Thomas NOËL ba58f183ed matrix42: do not log requests errors (#83101)
2023-11-03 11:38:02 +01:00
Thomas NOËL f564e71d5d translation update
2023-11-02 17:00:01 +01:00
Thomas NOËL 6f7acc1489 add matrix42 connector (#81490)
2023-11-02 15:59:03 +01:00
Serghei Mihai 62c0b91ac4 astech: don't log requests errors (#83034)
2023-11-02 12:16:23 +01:00
Emmanuel Cazenave 2b0842eb03 atal_rest: accept empty list in worksrequest-intervention-status (#83029)
2023-11-02 10:57:25 +01:00
Emmanuel Cazenave d315580294 translation update
2023-11-02 10:43:59 +01:00
Serghei Mihai 140863373f translation update
2023-10-31 16:03:12 +01:00
Serghei Mihai fa50ff9129 astech: add filters to column results (#82963)
2023-10-31 15:55:16 +01:00
Emmanuel Cazenave 0154defcce translation update
2023-10-31 15:04:24 +01:00
Emmanuel Cazenave f336d7a952 atal_rest: add worksrequest-intervention-status endpoint (#82948)
2023-10-31 14:58:31 +01:00
Thomas NOËL c148f6ae03 debian: add uwsgi/passerelle SyslogIdentifier in service (#82977)
2023-10-31 13:21:13 +01:00
Nicolas Roche bda1eba253 cron: add an every5min entry (#82961)
2023-10-30 18:02:50 +01:00
Emmanuel Cazenave 8892a97435 setup: compute pep440 compliant dirty version number (#81731)
2023-10-30 17:36:44 +01:00
Serghei Mihai 14a6fb1aed translations update
2023-10-19 10:11:49 +02:00
Benjamin Dauvergne f63e250e0d utils: add tools to execute actions outside of transactions (#31204)
Its first use is to create ResourceLog objects after the current transaction commits or aborts (an illustrative sketch of this deferral pattern follows the commit list).
2023-10-19 09:59:10 +02:00
Serghei Mihai 4789f1e1ff astech: add endpoint to get view data (#82416)
2023-10-17 16:41:42 +02:00
Serghei Mihai 2bbc835787 astech: add endpoint to get view columns (#82416) 2023-10-16 15:34:15 +02:00
Serghei Mihai 94184d9c5e astech: add endpoint to list available views (#82416) 2023-10-16 15:33:38 +02:00
Corentin Sechet 76f3860ad2 toulouse-foederis: allow empty degree files (#82390)
2023-10-16 11:29:49 +02:00
Corentin Sechet 3d5ec0268c toulouse-foederis: don't send filename in Content-Disposition of form-data when posting attachments (#82346)
2023-10-16 10:59:18 +02:00
Nicolas Roche f2652bac36 toulouse-smart: do not crash on receiving string in place of block field (#79816)
2023-10-12 15:08:55 +02:00
Corentin Sechet c598673e3d translation update
2023-10-12 15:05:52 +02:00
Corentin Sechet fe1f40cc7d toulouse-foederis: remove type_emploi data source (#82294)
2023-10-12 14:56:48 +02:00
Corentin Sechet a3db9b1e35 toulouse-foederis: use multipart/form-data to attach files & diplomas (#82291)
2023-10-12 11:18:56 +02:00
Thomas NOËL 92f5b5f26b update translations
2023-10-03 10:19:19 +02:00
Frédéric Péters d6b87039cb ci: keep on using pylint 2 while pylint-django is not ready (#81905)
2023-10-03 08:15:23 +02:00
Corentin Sechet 9d67f8587a toulouse-foederis: allow empty phone in create-application (#81728)
2023-10-02 10:28:48 +02:00
Benjamin Dauvergne a9f2956db7 templatetags: rendering of $id/$ref in jsonschema (#81643)
2023-10-02 10:08:31 +02:00
Benjamin Dauvergne 8266740b52 soap: handle recursive complexType (#81643)
References to already converted complexTypes are converted to JSON schema references (see the second sketch after the commit list).
2023-10-02 10:08:31 +02:00
Lauréline Guérin 117743e0a6 caluire-axel: fix family_date with a None first name (#81673)
2023-10-01 16:54:03 +02:00
Nicolas Roche 49226aca44 toulouse-maelis: add service filter on get_nursery_geojson (#81538)
2023-09-29 09:09:12 +02:00
Nicolas Roche bc62bdc3fd toulouse-maelis: [tests] return more nurseries on get_nursery_geojson (#81538) 2023-09-29 09:09:12 +02:00
Nicolas Roche 4bd7032998 toulouse-maelis: add service filter on read_nursery_list (#81538) 2023-09-29 09:09:12 +02:00
Nicolas Roche e1b3ab7646 toulouse-maelis: [tests] get last wsdls (#81538) 2023-09-29 09:09:12 +02:00
Frédéric Péters 7a671f7e74 misc: introduce setting to disable https checks for given hostnames (#81541)
2023-09-29 07:39:26 +02:00
Nicolas Roche 441ac49c58 toulouse-maelis: [tools] connector benchmark script (#81399)
2023-09-25 16:43:02 +02:00
Nicolas Roche bac28e933c toulouse-maelis: [tools] add tests to soap benchmark (#81399) 2023-09-25 16:43:02 +02:00
Benjamin Dauvergne b497988bf5 toulouse-maelis: [tools] read_family benchmark script (#81399) 2023-09-25 16:43:02 +02:00
Emmanuel Cazenave 898a14f821 atal_rest: accept empty files (#81518)
2023-09-25 14:38:33 +02:00
Emmanuel Cazenave ef0b518aba sne: fail silently on common sne errors (#81452)
2023-09-25 11:56:48 +02:00
Emmanuel Cazenave bf2610b4c5 sne: fail silently on incorrectly formatted demand_id (#81452) 2023-09-25 11:56:48 +02:00
Paul Marillonnet c56c0676de tests: provide tox4 compatibility (#81548)
2023-09-25 09:29:19 +02:00
Corentin Sechet 649c1c05a8 toulouse-foederis: send offer id in the right field when applying (#81476)
2023-09-22 10:08:56 +02:00
Benjamin Dauvergne a192a953b9 toulouse_maelis: cache soap_client for 5 minutes (#81418)
2023-09-20 18:47:37 +02:00
Benjamin Dauvergne faf3e4692e base: add cache to soap_client method (#81418) 2023-09-20 18:47:37 +02:00
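The two sketches below expand on commit messages above; they are illustrative only and are not part of the compared changes. First, commit f63e250e0d defers an action until the surrounding database transaction has finished. A minimal sketch of that pattern, assuming Django's documented transaction.on_commit() hook (which covers the commit case only, not the abort case the commit message mentions) and an invented helper name, not the passerelle implementation:

# Illustrative sketch only, not the passerelle implementation from #31204.
from django.db import transaction

def run_after_commit(action):
    # Django runs `action` after the enclosing transaction commits;
    # with no open transaction, the callback runs immediately.
    transaction.on_commit(action)

# Hypothetical usage: write a log entry only once the transaction is safe.
# run_after_commit(lambda: ResourceLog.objects.create(message='...'))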
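Second, commit 8266740b52 turns references to already-converted complexTypes into JSON schema references. A minimal sketch of that idea, with invented names (complex_type_to_schema, definitions) rather than the actual passerelle soap converter:

# Illustrative sketch only: already-converted complexTypes become
# JSON-schema "$ref" entries, which also stops infinite recursion on
# self-referencing types.
def complex_type_to_schema(name, complex_types, definitions):
    if name in definitions:  # already converted (or in progress): reference it
        return {'$ref': '#/definitions/%s' % name}
    definitions[name] = {}  # placeholder, filled in below
    properties = {}
    for field, field_type in complex_types[name].items():
        if field_type in complex_types:
            properties[field] = complex_type_to_schema(field_type, complex_types, definitions)
        else:
            properties[field] = {'type': 'string'}
    definitions[name] = {'type': 'object', 'properties': properties}
    return {'$ref': '#/definitions/%s' % name}

# Hypothetical usage with a self-referencing type:
# complex_types = {'Person': {'name': 'xs:string', 'manager': 'Person'}}
# definitions = {}
# complex_type_to_schema('Person', complex_types, definitions)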
93 changed files with 35423 additions and 599 deletions

.gitignore

@ -12,5 +12,7 @@ passerelle.egg-info/
coverage.xml
junit-py*.xml
.sass-cache/
passerelle/static/css/style.css
passerelle/static/css/style.css.map
passerelle/**/static/**/css/style.css
passerelle/**/static/**/css/style.css.map
node_modules/
coverage/

README

@ -126,3 +126,18 @@ django-jsonresponse (https://github.com/jjay/django-jsonresponse)
# Files: passerelle/utils/jsonresponse.py
# Copyright (c) 2012 Yasha Borevich <j.borevich@gmail.com>
# Licensed under the BSD license
tweetnacl-js (https://github.com/dchest/tweetnacl-js)
# Files: passerelle/apps/qrcode/static/qrcode/js/nacl.min.js
# Copyright: https://github.com/dchest/tweetnacl-js/blob/master/AUTHORS.md
# Licensed under the Unlicense license (public domain)
zxing-browser (https://github.com/zxing-js/browser/)
# Files: passerelle/apps/qrcode/static/qrcode/js/zxing-browser.min.js
# Copyright: (c) 2018 ZXing for JS
# Licensed under the MIT license.
RemixIcon (https://github.com/Remix-Design/RemixIcon)
# Files: passerelle/apps/qrcode/static/qrcode/img/favicon.ico
# Copyright (c) 2020 RemixIcon.com
# Licensed under the Apache License Version 2.0

View File

@ -4,6 +4,7 @@ After=network.target postgresql.service
Wants=postgresql.service
[Service]
SyslogIdentifier=uwsgi/%p
Environment=PASSERELLE_SETTINGS_FILE=/usr/lib/%p/debian_config.py
Environment=PASSERELLE_WSGI_TIMEOUT=120
Environment=PASSERELLE_WSGI_WORKERS=5

debian/uwsgi.ini

@ -18,6 +18,7 @@ spooler-python-import = passerelle.utils.spooler
spooler-max-tasks = 20
# every five minutes
unique-cron = -5 -1 -1 -1 -1 /usr/bin/passerelle-manage tenant_command cron --all-tenants every5min
unique-cron = -5 -1 -1 -1 -1 /usr/bin/passerelle-manage tenant_command cron --all-tenants availability
unique-cron = -5 -1 -1 -1 -1 /usr/bin/passerelle-manage tenant_command cron --all-tenants jobs
# hourly
@ -56,6 +57,7 @@ buffer-size = 32768
py-tracebacker = /run/passerelle/py-tracebacker.sock
stats = /run/passerelle/stats.sock
memory-report = true
ignore-sigpipe = true
disable-write-exception = true

View File

@ -147,6 +147,8 @@ class ASTech(BaseResource, HTTPResource):
_category_ordering = [_('Parameters'), _('Rules'), _('Demand'), 'Tech & Debug']
log_requests_errors = False
class Meta:
verbose_name = _('AS-TECH')
@ -159,7 +161,7 @@ class ASTech(BaseResource, HTTPResource):
try:
content = response.json()
except ValueError:
content = response.content[:1024]
content = '%r' % response.content[:1024]
raise APIError(
'AS-TECH response: %s %s' % (response.status_code, response.reason),
data={
@ -220,6 +222,48 @@ class ASTech(BaseResource, HTTPResource):
json_response = self.call_json(method, url, params=params, **kwargs)
return json_response
def get_view_schema(self, view_code):
cache_key = 'astech-%s-%s-schema' % (self.id, view_code)
schema = cache.get(cache_key)
if schema:
return schema
endpoint = 'apicli/data/%s/columns' % view_code
columns = self.call(endpoint).get('columns', [])
schema = {}
for column in columns:
column.pop('des')
code = column.pop('code')
if column['type'] == 'NUM':
column['operator'] = '='
else:
column['operator'] = 'is_equal'
schema[code] = column
cache.set(cache_key, schema)
return schema
def build_view_filters(self, view_code, filters):
if not filters:
return []
schema = self.get_view_schema(view_code)
filters_expression = []
for expression in filters.split(';'):
try:
name, value = expression.split('=')
except ValueError:
continue
if value and schema[name]['length'] and len(value) > int(schema[name]['length']):
raise APIError(
_('Value of %s exceeds authorized length (%s)') % (name, schema[name]['length'])
)
filters_expression.append(
{
'field': name,
'type': schema[name]['type'],
'filter': {'value': value, 'operator': schema[name]['operator']},
}
)
return filters_expression
@endpoint(
name='connections',
description=_('See all possible connections codes (see configuration)'),
@ -440,3 +484,69 @@ class ASTech(BaseResource, HTTPResource):
position['id'] = position['position']
position['text'] = position['positionLib']
return {'data': positions}
@endpoint(
name='list-views',
display_order=1,
description=_('List available views'),
display_category=_('Referential'),
)
def list_views(self, request):
results = self.call('apicli/data/views')
astech_views = results.get('views', [])
for view in astech_views:
view['id'] = view['apivId']
view['text'] = view['apivNom']
return {'data': astech_views}
@endpoint(
name='get-view-columns',
display_order=2,
description=_('Get view columns'),
display_category=_('Referential'),
parameters={
'code': {
'description': _('View code'),
'example_value': 'ASTECH_BIENS',
},
},
)
def get_view_columns(self, request, code):
endpoint = 'apicli/data/%s/columns' % code
results = self.call(endpoint)
columns = results.get('columns', [])
for column in columns:
column['id'] = column['code']
column['text'] = column['des']
return {'data': columns}
@endpoint(
name='get-view-data',
display_order=3,
description=_('Get view data'),
display_category=_('Referential'),
datasource=True,
parameters={
'code': {
'description': _('View code'),
'example_value': 'ASTECH_BIENS',
},
'id_column': {'description': _('Name of column containing the id'), 'example_value': 'BIEN_ID'},
'text_column': {
'description': _('Name of column containing the label'),
'example_value': 'DESIGNATION',
},
'filters': {
'description': _('Semicolon separated filter expressions'),
'example_value': 'GENRE=SIT;SECTEUR=S1',
},
},
)
def get_view_data(self, request, code, id_column, text_column, filters=None):
endpoint = 'apicli/data/%s/results' % code
filters = self.build_view_filters(code, filters)
results = self.call(endpoint, json={'data': {'filters': filters}})
for result in results:
result['id'] = result[id_column]
result['text'] = result[text_column]
return {'data': results}

View File

@ -30,29 +30,38 @@ from passerelle.base.models import BaseResource, HTTPResource
from passerelle.utils.api import endpoint
from passerelle.utils.jsonresponse import APIError
FILE_OBJECT = {
'type': 'object',
'description': 'File object',
'required': ['content'],
'properties': {
'filename': {
'type': 'string',
'description': 'Filename',
},
'content': {
'type': 'string',
'description': 'Content',
},
'content_type': {
'type': 'string',
'description': 'Content type',
},
},
}
SINGLE_ATTACHMENT_SCHEMA = {
'$schema': 'http://json-schema.org/draft-04/schema#',
'type': 'object',
'additionalProperties': False,
'properties': {
'file': {
'type': 'object',
'properties': {
'filename': {
'type': 'string',
'description': 'Filename',
},
'content': {
'type': 'string',
'description': 'Content',
},
'content_type': {
'type': 'string',
'description': 'Content type',
},
},
'required': ['content'],
},
'oneOf': [
FILE_OBJECT,
{'type': 'string', 'description': 'empty file, do not consider', 'pattern': r'^$'},
{'type': 'null', 'description': 'empty file, do not consider'},
]
}
},
'required': ['file'],
}
@ -66,22 +75,11 @@ ATTACHMENTS_SCHEMA = {
'files': {
'type': 'array',
'items': {
'type': 'object',
'properties': {
'filename': {
'type': 'string',
'description': 'Filename',
},
'content': {
'type': 'string',
'description': 'Content',
},
'content_type': {
'type': 'string',
'description': 'Content type',
},
},
'required': ['content'],
'oneOf': [
FILE_OBJECT,
{'type': 'string', 'description': 'empty file, do not consider', 'pattern': r'^$'},
{'type': 'null', 'description': 'empty file, do not consider'},
]
},
},
'worksrequests_ids': {'type': 'array', 'items': {'type': 'string'}},
@ -154,6 +152,14 @@ STATUS_MAP = {
}
INTERVENTION_STATUS_MAP = {
1: 'Pas commencé',
2: 'En cours',
4: 'Terminé',
5: 'Fermé',
}
def to_ds(record):
record['id'] = record['Id']
record['text'] = record['Name']
@ -379,6 +385,8 @@ class AtalREST(BaseResource, HTTPResource):
},
)
def worksrequests_single_attachment(self, request, worksrequests_id, post_data):
if not post_data['file']:
return {}
try:
content = base64.b64decode(post_data['file']['content'])
except (TypeError, binascii.Error):
@ -428,6 +436,8 @@ class AtalREST(BaseResource, HTTPResource):
def worksrequests_attachments(self, request, post_data):
files = []
for file_ in post_data.get('files', []):
if not file_:
continue
try:
content = base64.b64decode(file_['content'])
except (TypeError, binascii.Error):
@ -442,6 +452,8 @@ class AtalREST(BaseResource, HTTPResource):
),
)
)
if not files:
return {}
data = {'Ids': post_data['worksrequests_ids']}
# return nothing if successful
self._call(
@ -467,3 +479,19 @@ class AtalREST(BaseResource, HTTPResource):
resp_data = self._call('api/WorksRequests/%s' % worksrequests_id)
resp_data['RequestStateLabel'] = STATUS_MAP.get(resp_data.get('RequestState', ''), '')
return {'data': resp_data}
@endpoint(
methods=['get'],
name='worksrequest-intervention-status',
description=_('Get the status of a works request intervention'),
parameters={
'number': {
'example_value': 'DIT23070011',
}
},
)
def worksrequest_intervention_status(self, request, number):
resp_data = self._call('/api/WorksRequests/GetInterventionStates', params={'number': number})
resp_data = resp_data[0] if resp_data else {}
resp_data['WorkStateLabel'] = INTERVENTION_STATUS_MAP.get(resp_data.get('WorkState', ''), '')
return {'data': resp_data}

View File

@ -248,27 +248,34 @@ class CMISGateway:
def repo(self):
return self._cmis_client.defaultRepository
def _get_or_create_folder(self, file_path):
try:
self._logger.debug("searching '%s'" % file_path)
res = self.repo.getObjectByPath(file_path)
self._logger.debug("'%s' found" % file_path)
return res
except ObjectNotFoundException:
self._logger.debug("'%s' not found" % file_path)
basepath = ''
folder = self.repo.rootFolder
for path_part in file_path.strip('/').split('/'):
basepath += '/%s' % path_part
try:
self._logger.debug("searching '%s'" % basepath)
folder = self.repo.getObjectByPath(basepath)
self._logger.debug("'%s' found" % basepath)
except ObjectNotFoundException:
self._logger.debug("'%s' not found" % basepath)
folder = folder.createFolder(path_part)
self._logger.debug("create folder '%s'" % basepath)
return folder
def _get_or_create_folder(self, path):
parts = path.strip('/').split('/')
base = 0
for i in range(len(parts), 0, -1):
parent_path = '/' + '/'.join(parts[:i])
try:
folder = self.repo.getObjectByPath(parent_path)
base = i
break
except ObjectNotFoundException:
pass
except PermissionDeniedException:
raise APIError('CMIS server denied reading folder %s' % parent_path)
else:
try:
folder = self.repo.root_folder
except PermissionDeniedException:
raise APIError('CMIS server denied reading folder /')
for i in range(base, len(parts)):
try:
folder = folder.createFolder(parts[i])
except PermissionDeniedException:
parent_path = '/' + '/'.join(parts[: i + 1])
raise APIError('CMIS server denied creating folder %s' % parent_path)
return folder
@wrap_cmis_error
def create_doc(

View File

View File

@ -0,0 +1,76 @@
# Generated by Django 3.2.18 on 2023-09-22 13:03
from django.db import migrations, models
class Migration(migrations.Migration):
initial = True
dependencies = [
('base', '0030_resourcelog_base_resour_appname_298cbc_idx'),
]
operations = [
migrations.CreateModel(
name='Matrix42',
fields=[
(
'id',
models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID'),
),
('title', models.CharField(max_length=50, verbose_name='Title')),
('slug', models.SlugField(unique=True, verbose_name='Identifier')),
('description', models.TextField(verbose_name='Description')),
(
'basic_auth_username',
models.CharField(
blank=True, max_length=128, verbose_name='Basic authentication username'
),
),
(
'basic_auth_password',
models.CharField(
blank=True, max_length=128, verbose_name='Basic authentication password'
),
),
(
'client_certificate',
models.FileField(
blank=True, null=True, upload_to='', verbose_name='TLS client certificate'
),
),
(
'trusted_certificate_authorities',
models.FileField(blank=True, null=True, upload_to='', verbose_name='TLS trusted CAs'),
),
(
'verify_cert',
models.BooleanField(blank=True, default=True, verbose_name='TLS verify certificates'),
),
(
'http_proxy',
models.CharField(blank=True, max_length=128, verbose_name='HTTP and HTTPS proxy'),
),
(
'base_url',
models.URLField(
help_text='Example: https://xxx.m42cloud.com/m42Services/api/',
verbose_name='Webservice Base URL',
),
),
('token', models.CharField(max_length=512, verbose_name='Authorization Token')),
(
'users',
models.ManyToManyField(
blank=True,
related_name='_matrix42_matrix42_users_+',
related_query_name='+',
to='base.ApiUser',
),
),
],
options={
'verbose_name': 'Matrix42 Public API',
},
),
]

View File

@ -0,0 +1,212 @@
# passerelle - uniform access to multiple data sources and services
# Copyright (C) 2023 Entr'ouvert
#
# This program is free software: you can redistribute it and/or modify it
# under the terms of the GNU Affero General Public License as published
# by the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU Affero General Public License for more details.
#
# You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <http://www.gnu.org/licenses/>.
from urllib.parse import urljoin
from django.db import models
from django.utils.translation import gettext_lazy as _
from passerelle.base.models import BaseResource, HTTPResource
from passerelle.utils.api import endpoint
from passerelle.utils.jsonresponse import APIError
from passerelle.utils.templates import render_to_string
DICT_SCHEMA = {
'$schema': 'http://json-schema.org/draft-04/schema#',
'type': 'object',
'additionalProperties': True,
'unflatten': True,
}
class Matrix42(BaseResource, HTTPResource):
category = _('Business Process Connectors')
log_requests_errors = False
class Meta:
verbose_name = _('Matrix42 Public API')
base_url = models.URLField(
_('Webservice Base URL'), help_text=_('Example: https://xxx.m42cloud.com/m42Services/api/')
)
token = models.CharField(max_length=512, verbose_name=_('Authorization Token'))
def get_authorization_headers(self):
headers = {
'Content-Type': 'application/json',
'Authorization': 'Bearer ' + self.token,
}
token = self.request('ApiToken/GenerateAccessTokenFromApiToken', headers=headers, method='POST')
if 'RawToken' not in token:
raise APIError('Matrix42 did not return a RawToken: %s' % token)
return {
'Content-Type': 'application/json',
'Authorization': 'Bearer ' + token['RawToken'],
}
def request(self, uri, params=None, json=None, headers=None, method=None, dict_response=True):
if headers is None:
headers = self.get_authorization_headers()
if method is None:
method = 'GET' if json is None else 'POST'
url = urljoin(self.base_url, uri)
response = self.requests.request(method, url, params=params, json=json, headers=headers)
status_code = response.status_code
try:
response = response.json()
except ValueError:
raise APIError(
'Matrix42 returned %s response with invalid JSON content: %r'
% (status_code, response.content)
)
if dict_response:
if not isinstance(response, dict):
raise APIError(
'Matrix42 returned a %s response that is not a dict: %r' % (status_code, response),
data=response,
)
if isinstance(response, dict) and 'ExceptionName' in response:
message = response.get('Message') or '(no message)'
raise APIError(
'Matrix42 returned %s response, ExceptionName "%s": %s'
% (status_code, response['ExceptionName'], message),
data=response,
)
if status_code // 100 != 2:
raise APIError('Matrix42 returned status code %s' % status_code, data=response)
return response
@endpoint(
name='fragment',
pattern=r'^(?P<ddname>.+)$',
example_pattern='SPSUserClassBase',
description=_('Fragment Query'),
display_category=_('Fragments'),
parameters={
'ddname': {
'description': _('Technical name of the Data Definition'),
},
'columns': {
'description': _('Columns in the result set, separated by comma'),
'example_value': 'ID,[Expression-ObjectID] as EOID,LastName,FirstName,MailAddress',
},
'filter': {
'description': _('Filter: "WHERE filter"'),
},
'template': {
'description': _(
'Django template for text attribute - if none, use DisplayString|DisplayName|Name'
),
'example_value': '{{ FirstName }} {{ LastName }} ({{ MailAddress }})',
},
'id_template': {
'description': _('Django template for id attribute - if none, use ID'),
'example_value': '{{ ID }}',
},
'search_column': {
'description': _('Search column: "WHERE search_column LIKE \'%q%\' (AND filter)"'),
},
'q': {'description': _('Search text (needs a search_column)')},
'id': {'description': _('Get the whole fragment with this ID')},
},
)
def fragment(
self,
request,
ddname,
columns=None,
filter=None,
template=None,
id_template=None,
search_column=None,
q=None,
id=None,
):
def add_id_and_text(result):
if id_template:
result['id'] = render_to_string(id_template, result)
else:
result['id'] = result.get('ID')
if template:
result['text'] = render_to_string(template, result)
else:
result['text'] = (
result.get('DisplayString') or result.get('DisplayName') or result.get('Name') or ''
)
if id:
uri = 'data/fragments/%s/%s' % (ddname, id)
result = self.request(uri)
add_id_and_text(result)
return {'data': [result]}
if q is not None and not search_column:
raise APIError('q needs a search_column parameter', http_status=400)
uri = urljoin(self.base_url, 'data/fragments/%s/schema-info' % ddname)
params = {}
if columns:
params['columns'] = columns
if q is not None:
params['where'] = "%s LIKE '%%%s%%'" % (search_column, q.replace("'", "''"))
if filter:
params['where'] += ' AND %s' % filter
elif filter:
params['where'] = filter
results = self.request(uri, params=params).get('Result') or []
for result in results:
add_id_and_text(result)
return {'data': results}
@endpoint(
name='get-object',
description=_('Get an object'),
display_category=_('Objects'),
methods=['get'],
pattern=r'^(?P<ciname>.+)/(?P<object_id>.+)$',
example_pattern='SPSActivityTypeTicket/01b02f7d-adb6-49e6-aae3-66251ecbf98e',
)
def get_object(
self,
request,
ciname,
object_id,
):
uri = urljoin(self.base_url, 'data/objects/%s/%s' % (ciname, object_id))
return {'data': self.request(uri)}
@endpoint(
name='create-object',
display_category=_('Objects'),
methods=['post'],
pattern=r'^(?P<ciname>.+)$',
example_pattern='SPSActivityTypeTicket',
post={
'description': _('Create a new object'),
'request_body': {'schema': {'application/json': DICT_SCHEMA}},
},
)
def create_object(
self,
request,
ciname,
post_data,
):
uri = urljoin(self.base_url, 'data/objects/%s' % ciname)
return {'data': self.request(uri, json=post_data, dict_response=False)}

View File

@ -16,6 +16,7 @@
import json
import os
import shutil
import zipfile
from django.db import models
@ -191,6 +192,11 @@ class MDEL(BaseResource):
if self.incoming_sftp:
self.get_response_files()
def daily(self):
super().daily()
# cleanup demands folders
Demand.cleanup()
class Demand(models.Model):
created_at = models.DateTimeField(auto_now_add=True)
@ -447,3 +453,11 @@ class Demand(models.Model):
self.save()
return result
@classmethod
def cleanup(cls):
for instance in cls.objects.all():
dirname = os.path.join(instance.resource.input_dir, instance.name)
if not os.path.exists(dirname):
continue
shutil.rmtree(dirname, ignore_errors=True)

View File

@ -19,6 +19,7 @@ import math
import xml.etree.ElementTree as ET
import pyproj
from django.conf import settings
from django.core.cache import cache
from django.db import models, transaction
from django.db.models import JSONField, Q
@ -40,8 +41,12 @@ def build_dict_from_xml(elem):
d = {}
for child in elem.find('.'):
if child.tag.startswith('{'):
continue
attribute_name = slugify(child.tag).replace('-', '_')
if child.tag.split('}')[0][1:] in settings.OPENGIS_SKIPPED_NAMESPACES:
continue
attribute_name = child.tag.split('}')[1]
else:
attribute_name = child.tag
attribute_name = slugify(attribute_name).replace('-', '_')
if child.text and child.text.strip():
d[attribute_name] = child.text.strip()
else:

View File

@ -0,0 +1,15 @@
# passerelle - uniform access to multiple data sources and services
# Copyright (C) 2023 Entr'ouvert
#
# This program is free software: you can redistribute it and/or modify it
# under the terms of the GNU Affero General Public License as published
# by the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU Affero General Public License for more details.
#
# You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <http://www.gnu.org/licenses/>.

View File

@ -0,0 +1,78 @@
# Generated by Django 3.2.18 on 2023-11-02 09:29
import uuid
import django.core.validators
import django.db.models.deletion
from django.db import migrations, models
import passerelle.apps.qrcode.models
class Migration(migrations.Migration):
initial = True
dependencies = [
('base', '0030_resourcelog_base_resour_appname_298cbc_idx'),
]
operations = [
migrations.CreateModel(
name='QRCodeConnector',
fields=[
(
'id',
models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID'),
),
('title', models.CharField(max_length=50, verbose_name='Title')),
('slug', models.SlugField(unique=True, verbose_name='Identifier')),
('description', models.TextField(verbose_name='Description')),
(
'key',
models.CharField(
default=passerelle.apps.qrcode.models.generate_key,
max_length=64,
validators=[
django.core.validators.RegexValidator(
'[a-z|0-9]{64}', 'Key should be a 32 bytes hexadecimal string'
)
],
verbose_name='Private Key',
),
),
(
'users',
models.ManyToManyField(
blank=True,
related_name='_qrcode_qrcodeconnector_users_+',
related_query_name='+',
to='base.ApiUser',
),
),
],
options={
'verbose_name': 'QR Code',
},
),
migrations.CreateModel(
name='Certificate',
fields=[
(
'id',
models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID'),
),
('uuid', models.UUIDField(default=uuid.uuid4, unique=True, verbose_name='UUID')),
('validity_start', models.DateTimeField(verbose_name='Validity Start Date')),
('validity_end', models.DateTimeField(verbose_name='Validity End Date')),
('data', models.JSONField(null=True, verbose_name='Certificate Data')),
(
'resource',
models.ForeignKey(
on_delete=django.db.models.deletion.CASCADE,
related_name='certificates',
to='qrcode.qrcodeconnector',
),
),
],
),
]

View File

@ -0,0 +1,35 @@
# Generated by Django 3.2.18 on 2023-11-02 09:31
import uuid
import django.db.models.deletion
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('qrcode', '0001_initial'),
]
operations = [
migrations.CreateModel(
name='Reader',
fields=[
(
'id',
models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID'),
),
('uuid', models.UUIDField(default=uuid.uuid4, unique=True, verbose_name='UUID')),
('validity_start', models.DateTimeField(verbose_name='Validity Start Date')),
('validity_end', models.DateTimeField(verbose_name='Validity End Date')),
(
'resource',
models.ForeignKey(
on_delete=django.db.models.deletion.CASCADE,
related_name='readers',
to='qrcode.qrcodeconnector',
),
),
],
),
]

View File

@ -0,0 +1,39 @@
# Generated by Django 3.2.18 on 2023-11-13 21:07
import django.utils.timezone
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('qrcode', '0002_reader'),
]
operations = [
migrations.AddField(
model_name='certificate',
name='created',
field=models.DateTimeField(
auto_now_add=True, default=django.utils.timezone.now, verbose_name='Created'
),
preserve_default=False,
),
migrations.AddField(
model_name='certificate',
name='modified',
field=models.DateTimeField(auto_now=True, verbose_name='Last modification'),
),
migrations.AddField(
model_name='reader',
name='created',
field=models.DateTimeField(
auto_now_add=True, default=django.utils.timezone.now, verbose_name='Created'
),
preserve_default=False,
),
migrations.AddField(
model_name='reader',
name='modified',
field=models.DateTimeField(auto_now=True, verbose_name='Last modification'),
),
]

View File

@ -0,0 +1,32 @@
# Generated by Django 3.2.18 on 2023-11-13 21:31
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('qrcode', '0003_created_modified'),
]
operations = [
migrations.AlterField(
model_name='certificate',
name='validity_end',
field=models.DateTimeField(null=True, verbose_name='Validity End Date'),
),
migrations.AlterField(
model_name='certificate',
name='validity_start',
field=models.DateTimeField(null=True, verbose_name='Validity Start Date'),
),
migrations.AlterField(
model_name='reader',
name='validity_end',
field=models.DateTimeField(null=True, verbose_name='Validity End Date'),
),
migrations.AlterField(
model_name='reader',
name='validity_start',
field=models.DateTimeField(null=True, verbose_name='Validity Start Date'),
),
]

View File

@ -0,0 +1,358 @@
import binascii
import os
import uuid
from datetime import datetime, timezone
from io import BytesIO
from django.core.validators import RegexValidator
from django.db import models
from django.http import HttpResponse
from django.shortcuts import get_object_or_404
from django.template.response import TemplateResponse
from django.urls import reverse
from django.utils.dateparse import parse_datetime
from django.utils.translation import gettext_lazy as _
from nacl.signing import SigningKey
from qrcode import ERROR_CORRECT_Q, QRCode
from qrcode.image.pil import PilImage
from passerelle.base.models import BaseResource
from passerelle.utils.api import endpoint
CERTIFICATE_SCHEMA = {
'$schema': 'http://json-schema.org/draft-06/schema#',
'type': 'object',
'unflatten': True,
'additionalProperties': False,
'properties': {
'data': {
'type': 'object',
'title': _('Data to encode in the certificate'),
'additionalProperties': {'type': 'string'},
},
'validity_start': {
'any': [{'type': 'null'}, {'const': ''}, {'type': 'string', 'format': 'date-time'}],
},
'validity_end': {
'any': [{'type': 'null'}, {'const': ''}, {'type': 'string', 'format': 'date-time'}],
},
},
}
READER_SCHEMA = {
'$schema': 'http://json-schema.org/draft-06/schema#',
'type': 'object',
'additionalProperties': False,
'properties': {
'validity_start': {
'any': [{'type': 'null'}, {'const': ''}, {'type': 'string', 'format': 'date-time'}],
},
'validity_end': {
'any': [{'type': 'null'}, {'const': ''}, {'type': 'string', 'format': 'date-time'}],
},
},
}
def generate_key():
key = os.urandom(32)
return ''.join(format(x, '02x') for x in key)
UUID_PATTERN = '(?P<uuid>[0-9|a-f]{8}-[0-9|a-f]{4}-[0-9|a-f]{4}-[0-9|a-f]{4}-[0-9a-f]{12})'
class QRCodeConnector(BaseResource):
category = _('Misc')
key = models.CharField(
_('Private Key'),
max_length=64,
default=generate_key,
validators=[RegexValidator(r'[a-z|0-9]{64}', 'Key should be a 32 bytes hexadecimal string')],
)
class Meta:
verbose_name = _('QR Code')
@property
def signing_key(self):
binary_key = binascii.unhexlify(self.key)
return SigningKey(seed=binary_key)
@property
def hex_verify_key(self):
verify_key = self.signing_key.verify_key.encode()
return binascii.hexlify(verify_key).decode('utf-8')
@endpoint(
name='save-certificate',
pattern=f'^{UUID_PATTERN}?$',
example_pattern='{uuid}',
description=_('Create or update a certificate'),
post={'request_body': {'schema': {'application/json': CERTIFICATE_SCHEMA}}},
parameters={
'uuid': {
'description': _('Certificate identifier'),
'example_value': '12345678-1234-1234-1234-123456789012',
}
},
)
def save_certificate(self, request, uuid=None, post_data=None):
if post_data.get('validity_start'):
validity_start = parse_datetime(post_data['validity_start'])
else:
validity_start = None
if post_data.get('validity_end'):
validity_end = parse_datetime(post_data['validity_end'])
else:
validity_end = None
data = post_data.get('data') or {}
if not uuid:
certificate = self.certificates.create(
data=data,
validity_start=validity_start,
validity_end=validity_end,
)
else:
certificate = get_object_or_404(self.certificates, uuid=uuid)
certificate.validity_start = validity_start
certificate.validity_end = validity_end
certificate.data = data
certificate.save()
return {
'data': {
'uuid': certificate.uuid,
'qrcode_url': certificate.get_qrcode_url(request),
}
}
@endpoint(
name='get-certificate',
description=_('Retrieve an existing certificate'),
pattern=f'^{UUID_PATTERN}$',
example_pattern='{uuid}',
parameters={
'uuid': {
'description': _('Certificate identifier'),
'example_value': '12345678-1234-1234-1234-123456789012',
}
},
)
def get_certificate(self, request, uuid):
certificate = get_object_or_404(self.certificates, uuid=uuid)
return {
'err': 0,
'data': {
'uuid': certificate.uuid,
'data': certificate.data,
'validity_start': certificate.validity_start and certificate.validity_start.isoformat(),
'validity_end': certificate.validity_end and certificate.validity_end.isoformat(),
'qrcode_url': certificate.get_qrcode_url(request),
},
}
@endpoint(
name='get-qrcode',
description=_('Get QR Code'),
pattern=f'^{UUID_PATTERN}$',
example_pattern='{uuid}',
parameters={
'uuid': {
'description': _('Certificate identifier'),
'example_value': '12345678-1234-1234-1234-123456789012',
}
},
)
def get_qrcode(self, request, uuid):
certificate = self.certificates.get(uuid=uuid)
qr_code = certificate.generate_qr_code()
return HttpResponse(qr_code, content_type='image/png')
@endpoint(
name='save-reader',
pattern=f'^{UUID_PATTERN}?$',
example_pattern='{uuid}',
description=_('Create or update a qrcode reader'),
post={'request_body': {'schema': {'application/json': READER_SCHEMA}}},
parameters={
'uuid': {
'description': _('QRCode reader identifier'),
'example_value': '12345678-1234-1234-1234-123456789012',
}
},
)
def save_reader(self, request, uuid=None, post_data=None):
if post_data.get('validity_start'):
validity_start = parse_datetime(post_data['validity_start'])
else:
validity_start = None
if post_data.get('validity_end'):
validity_end = parse_datetime(post_data['validity_end'])
else:
validity_end = None
if not uuid:
reader = self.readers.create(
validity_start=validity_start,
validity_end=validity_end,
)
else:
reader = get_object_or_404(self.readers, uuid=uuid)
reader.validity_start = validity_start
reader.validity_end = validity_end
reader.save()
return {
'data': {
'uuid': reader.uuid,
'url': reader.get_url(request),
}
}
@endpoint(
name='get-reader',
description=_('Get information about a QRCode reader'),
pattern=f'^{UUID_PATTERN}$',
example_pattern='{uuid}',
parameters={
'uuid': {
'description': _('QRCode reader identifier'),
'example_value': '12345678-1234-1234-1234-123456789012',
}
},
)
def get_reader(self, request, uuid):
reader = get_object_or_404(self.readers, uuid=uuid)
return {
'err': 0,
'data': {
'uuid': reader.uuid,
'validity_start': reader.validity_start and reader.validity_start.isoformat(),
'validity_end': reader.validity_end and reader.validity_end.isoformat(),
'url': reader.get_url(request),
},
}
@endpoint(
name='open-reader',
perm='OPEN',
description=_('Open a QRCode reader page.'),
pattern=f'^{UUID_PATTERN}$',
example_pattern='{uuid}',
parameters={
'uuid': {
'description': _('QRCode reader identifier'),
'example_value': '12345678-1234-1234-1234-123456789012',
}
},
)
def open_reader(self, request, uuid):
reader = get_object_or_404(self.readers, uuid=uuid)
now = datetime.now(timezone.utc)
return TemplateResponse(
request,
'qrcode/qrcode-reader.html',
context={
'started': now >= reader.validity_start if reader.validity_start is not None else True,
'expired': now >= reader.validity_end if reader.validity_end is not None else False,
'verify_key': self.hex_verify_key,
'reader': reader,
},
)
def encode_mime_like(data):
msg = ''
for key, value in data.items():
msg += '%s: %s\n' % (key, value.replace('\n', '\n '))
return msg.encode()
BASE45_CHARSET = '0123456789ABCDEFGHIJKLMNOPQRSTUVWXYZ $%*+-./:'
BASE45_DICT = {v: i for i, v in enumerate(BASE45_CHARSET)}
def b45encode(buf: bytes) -> bytes:
"""Convert bytes to base45-encoded string"""
res = ''
buflen = len(buf)
for i in range(0, buflen & ~1, 2):
x = (buf[i] << 8) + buf[i + 1]
e, x = divmod(x, 45 * 45)
d, c = divmod(x, 45)
res += BASE45_CHARSET[c] + BASE45_CHARSET[d] + BASE45_CHARSET[e]
if buflen & 1:
d, c = divmod(buf[-1], 45)
res += BASE45_CHARSET[c] + BASE45_CHARSET[d]
return res.encode()
class Certificate(models.Model):
uuid = models.UUIDField(verbose_name=_('UUID'), unique=True, default=uuid.uuid4)
created = models.DateTimeField(_('Created'), auto_now_add=True)
modified = models.DateTimeField(verbose_name=_('Last modification'), auto_now=True)
validity_start = models.DateTimeField(verbose_name=_('Validity Start Date'), null=True)
validity_end = models.DateTimeField(verbose_name=_('Validity End Date'), null=True)
data = models.JSONField(null=True, verbose_name='Certificate Data')
resource = models.ForeignKey(QRCodeConnector, on_delete=models.CASCADE, related_name='certificates')
def to_json(self):
data = {'uuid': str(self.uuid)}
if self.validity_start:
data['validity_start'] = str(self.validity_start.timestamp())
if self.validity_end:
data['validity_end'] = str(self.validity_end.timestamp())
data |= self.data
return data
def generate_b45_data(self):
data = self.to_json()
msg = encode_mime_like(data)
signed = self.resource.signing_key.sign(msg)
return b45encode(signed).decode()
def generate_qr_code(self):
qr_code = QRCode(image_factory=PilImage, error_correction=ERROR_CORRECT_Q)
data = self.generate_b45_data()
qr_code.add_data(data)
qr_code.make(fit=True)
image = qr_code.make_image(fill_color='black', back_color='white')
fd = BytesIO()
image.save(fd)
return fd.getvalue()
def get_qrcode_url(self, request):
qrcode_relative_url = reverse(
'generic-endpoint',
kwargs={
'slug': self.resource.slug,
'connector': self.resource.get_connector_slug(),
'endpoint': 'get-qrcode',
'rest': str(self.uuid),
},
)
return request.build_absolute_uri(qrcode_relative_url)
class Reader(models.Model):
uuid = models.UUIDField(verbose_name=_('UUID'), unique=True, default=uuid.uuid4)
created = models.DateTimeField(_('Created'), auto_now_add=True)
modified = models.DateTimeField(verbose_name=_('Last modification'), auto_now=True)
validity_start = models.DateTimeField(verbose_name=_('Validity Start Date'), null=True)
validity_end = models.DateTimeField(verbose_name=_('Validity End Date'), null=True)
resource = models.ForeignKey(QRCodeConnector, on_delete=models.CASCADE, related_name='readers')
def get_url(self, request):
relative_url = reverse(
'generic-endpoint',
kwargs={
'slug': self.resource.slug,
'connector': self.resource.get_connector_slug(),
'endpoint': 'open-reader',
'rest': str(self.uuid),
},
)
return request.build_absolute_uri(relative_url)

View File

@ -0,0 +1,145 @@
$red: #C4381C;
$red-light: #FCECE8;
$green: #47752F;
$green-light: #F2F7EE;
$gray-light: #CECECE;
qrcode-reader {
position: absolute;
inset: 0;
display: grid;
}
.qrcode-reader {
&--video-wrapper {
grid-area: 1 / 1 / 2 / 2;
width: fit-content;
height: fit-content;
display: grid;
justify-self: center;
align-self: center;
}
&--video {
grid-area: 1 / 1 / 2 / 2;
justify-self: center;
align-self: center;
width: 100%;
height: auto;
max-height: 100%;
.fullscreen & {
height: 100vh;
}
}
&--fullscreen-button {
z-index: 1;
grid-area: 1 / 1 / 2 / 2;
width: 1.8rem;
height: 1.8rem;
align-self: end;
justify-self: end;
border: none;
fill: white;
margin: 5px;
}
.fullscreen &--enter-fullscreen-icon {
display: none;
}
&--exit-fullscreen-icon {
display: none;
.fullscreen & {
display: block;
}
}
&--popup {
grid-area: 1 / 1 / 2 / 2;
align-self: end;
justify-self: center;
z-index: 1;
--title-background: #{$green-light};
--title-color: #{$green};
border: 2px solid var(--title-color);
border-radius: 5px;
margin: 10px;
box-shadow: 0 0 10px 3px #ffffff;
background: white;
display: flex;
flex-direction: column;
justify-items: center;
&.error {
--title-background: #{$red-light};
--title-color: #{$red};
}
&.closed {
display: none;
}
}
&--popup-title {
font-size: 1.2rem;
font-weight: bold;
text-align: center;
border-top-right-radius: 5px;
border-top-left-radius: 5px;
padding: 5px;
color: var(--title-color);
background-color: var(--title-background);
}
&--popup-content {
margin: 10px;
}
&--validity {
color: var(--title-color);
font-weight: bold;
display: grid;
grid-template-columns: auto 1fr;
gap: 5px;
margin-bottom: 5px;
}
&--validity-label {
align-self: end;
}
&--data-items {
display: grid;
grid-template-columns: auto 1fr;
margin-top: 5px;
padding-top: 5px;
border-top: 1px solid var(--gray-light);
}
&--data-item-label {
font-weight: bold;
justify-self: end;
}
&--data-item-value {
display: flex;
flex-wrap: wrap;
}
&--close-popup-button {
margin: 5px 10px 10px 10px;
padding: 5px;
font-size: 1.2rem;
}
}

Binary file not shown (new file, 4.2 KiB).

File diff suppressed because one or more lines are too long

View File

@ -0,0 +1,297 @@
import './nacl.min.js'
import './zxing-browser.min.js'
/* c8 ignore start */
// https://github.com/zxing-js/browser/issues/72
if (window.ZXingBrowser) {
const patchedMediaStreamIsTorchCompatible = window.ZXingBrowser.BrowserCodeReader.mediaStreamIsTorchCompatible
window.ZXingBrowser.BrowserCodeReader.mediaStreamIsTorchCompatible = (track) => {
return track.getCapabilities && patchedMediaStreamIsTorchCompatible(track)
}
}
/* c8 ignore stop */
const translations = (() => {
const i18nElement = window.document.getElementById('qrcode-reader-i18n')
if (i18nElement) {
return JSON.parse(i18nElement.innerHTML)
}
return {}
})()
function translate (key) { return translations[key] || key }
function template (innerHTML) {
const templateElement = document.createElement('template')
templateElement.innerHTML = innerHTML
return templateElement
}
const notSupportedTemplate = template(`<p>${translate('not_supported')}</p>`)
const readerTemplate = template(`
<div class="qrcode-reader--video-wrapper">
<video class="qrcode-reader--video"></video>
<div class="qrcode-reader--fullscreen-button">
<svg
xmlns="http://www.w3.org/2000/svg"
viewBox="0 0 24 24"
class="qrcode-reader--enter-fullscreen-icon">
<path d="M8 3V5H4V9H2V3H8ZM2 21V15H4V19H8V21H2ZM22 21H16V19H20V15H22V21ZM22 9H20V5H16V3H22V9Z"></path>
</svg>
<svg xmlns="http://www.w3.org/2000/svg" viewBox="0 0 24 24" class="qrcode-reader--exit-fullscreen-icon">
<path d="M18 7H22V9H16V3H18V7ZM8 9H2V7H6V3H8V9ZM18 17V21H16V15H22V17H18ZM8 15V21H6V17H2V15H8Z"></path>
</svg>
</div>
</div>
<div class="qrcode-reader--popup closed">
<div class="qrcode-reader--popup-title"></div>
<div class="qrcode-reader--popup-content"></div>
<button class="qrcode-reader--close-popup-button">${translate('close')}</button>
</div>
`)
const validityTemplate = template(`
<div class="qrcode-reader--validity">
<div class="qrcode-reader--validity-label">${translate('from')} :</div>
<div>{validityStart}</div>
<div>${translate('to')} :</div>
<div>{validityEnd}</div>
</div>
`)
const dataTemplate = template(`
<div class="qrcode-reader--data-items"></div>
`)
const dataItemTemplate = template(`
<span class="qrcode-reader--data-item-label">{label}&nbsp;:&nbsp;</span>
<span class="qrcode-reader--data-item-value">{value}</span>
`)
function decodeMimeLike (value) {
const chunks = value.split('\n')
const data = {}
let k = null
let v = null
for (let i = 0; i < chunks.length; i++) {
const line = chunks[i]
if (line.startsWith(' ')) {
if (k !== null) {
v += '\n' + line.slice(1)
}
} else {
if (k !== null) {
data[k] = v
k = null
v = null
}
if (line.indexOf(': ') !== -1) {
const parts = line.split(': ', 2)
k = parts[0]
v = parts[1]
}
}
}
if (k !== null) {
data[k] = v
}
return data
}
function divmod (a, b) {
let remainder = a
let quotient = 0
if (a >= b) {
remainder = a % b
quotient = (a - remainder) / b
}
return [quotient, remainder]
}
const BASE45_CHARSET = '0123456789ABCDEFGHIJKLMNOPQRSTUVWXYZ $%*+-./:'
function decodeBase45 (str) {
const output = []
const buf = []
for (let i = 0, length = str.length; i < length; i++) {
const j = BASE45_CHARSET.indexOf(str[i])
if (j < 0) { throw new Error('Base45 decode: unknown character') }
buf.push(j)
}
for (let i = 0, length = buf.length; i < length; i += 3) {
const x = buf[i] + buf[i + 1] * 45
if (length - i >= 3) {
const [d, c] = divmod(x + buf[i + 2] * 45 * 45, 256)
output.push(d)
output.push(c)
} else {
output.push(x)
}
}
return new Uint8Array(output)
}
class QRCodeReader extends window.HTMLElement {
#popup
#popupContent
#popupTitle
constructor () {
super()
if (!this.#supported()) {
this.appendChild(notSupportedTemplate.content.cloneNode(true))
return
}
this.appendChild(readerTemplate.content.cloneNode(true))
this.#popup = this.querySelector('.qrcode-reader--popup')
this.#popupContent = this.querySelector('.qrcode-reader--popup-content')
this.#popupTitle = this.querySelector('.qrcode-reader--popup-title')
const closePopupButton = this.querySelector('.qrcode-reader--close-popup-button')
closePopupButton.addEventListener('click', () => {
this.#popup.classList.add('closed')
})
const fullScreenButton = this.querySelector('.qrcode-reader--fullscreen-button')
fullScreenButton.addEventListener('click', () => {
this.#toggleFullScreen()
})
this.addEventListener('fullscreenchange', () => {
this.#onFullScreenChanged()
})
}
connectedCallback () {
if (!this.#supported()) {
return
}
this.#startScan()
}
async #startScan () {
const codeReader = new window.ZXingBrowser.BrowserQRCodeReader()
const videoElement = this.querySelector('.qrcode-reader--video')
await codeReader.decodeFromVideoDevice(undefined, videoElement, (result) => {
if (result) {
this.#showResult(result.text)
}
})
}
get #verifyKey () {
const hexKey = this.getAttribute('verify-key')
return new Uint8Array(hexKey.match(/[\da-f]{2}/gi).map(h => parseInt(h, 16)))
}
#supported () {
return !!navigator.mediaDevices
}
#showResult (qrCodeContent) {
this.#popup.classList.remove('error')
this.#popup.classList.remove('closed')
let signed
try {
signed = decodeBase45(qrCodeContent)
} catch (error) {
this.#showError(translate('invalid_qrcode'))
return
}
const opened = window.nacl.sign.open(signed, this.#verifyKey)
if (opened == null) {
this.#showError(translate('invalid_signature'))
return
}
this.#popupContent.innerHTML = ''
const decoder = new TextDecoder('utf-8')
const decoded = decoder.decode(opened)
const data = decodeMimeLike(decoded)
delete data.uuid
const validityStart = data.validity_start && new Date(parseFloat(data.validity_start) * 1000)
delete data.validity_start
const validityEnd = data.validity_end && new Date(parseFloat(data.validity_end) * 1000)
delete data.validity_end
const now = new Date()
if (validityStart && now.getTime() < validityStart.getTime()) {
this.#popupTitle.innerText = translate('not_yet_valid')
this.#popup.classList.add('error')
} else if (validityEnd && now.getTime() > validityEnd.getTime()) {
this.#popupTitle.innerText = translate('expired')
this.#popup.classList.add('error')
} else {
this.#popupTitle.innerText = translate('valid')
}
const validityElement = validityTemplate.cloneNode(true)
if (validityStart) {
validityElement.innerHTML = validityElement.innerHTML.replace('{validityStart}', validityStart.toLocaleString())
} else {
validityElement.innerHTML = validityElement.innerHTML.replace('{validityStart}', translate('always'))
}
if (validityEnd) {
validityElement.innerHTML = validityElement.innerHTML.replace('{validityEnd}', validityEnd.toLocaleString())
} else {
validityElement.innerHTML = validityElement.innerHTML.replace('{validityEnd}', translate('never'))
}
this.#popupContent.append(validityElement.content)
const dataElement = dataTemplate.cloneNode(true)
const dataItems = dataElement.content.querySelector('.qrcode-reader--data-items')
for (const [key, value] of Object.entries(data)) {
const dataItem = dataItemTemplate.cloneNode(true)
dataItem.innerHTML = dataItem.innerHTML.replace('{label}', key).replace('{value}', value)
dataItems.append(dataItem.content)
}
this.#popupContent.append(dataElement.content)
}
#showError (message) {
this.#popup.classList.remove('closed')
this.#popup.classList.add('error')
this.#popupTitle.innerText = translate('invalid_title')
this.#popupContent.innerText = message
}
#toggleFullScreen () {
if (document.fullscreenElement) {
document.exitFullscreen()
} else {
this.requestFullscreen()
}
}
#onFullScreenChanged () {
if (document.fullscreenElement === this) {
this.classList.add('fullscreen')
} else {
this.classList.remove('fullscreen')
}
}
}
window.customElements.define('qrcode-reader', QRCodeReader)
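The reader above expects the QR code to carry a Base45-encoded Ed25519 signed message (nacl.sign) whose clear text is the MIME-like "key: value" block parsed by decodeMimeLike, with validity_start/validity_end as epoch seconds. A minimal Python sketch of a compatible payload generator, assuming PyNaCl is available; field names other than uuid/validity_start/validity_end are illustrative:

import time

from nacl.encoding import HexEncoder
from nacl.signing import SigningKey

BASE45_CHARSET = '0123456789ABCDEFGHIJKLMNOPQRSTUVWXYZ $%*+-./:'


def encode_base45(data):
    # inverse of decodeBase45 above: two bytes become three base45 digits,
    # a trailing single byte becomes two digits (little-endian)
    out = []
    for i in range(0, len(data), 2):
        chunk = data[i:i + 2]
        if len(chunk) == 2:
            n = chunk[0] * 256 + chunk[1]
            n, c0 = divmod(n, 45)
            c2, c1 = divmod(n, 45)
            out += [BASE45_CHARSET[c0], BASE45_CHARSET[c1], BASE45_CHARSET[c2]]
        else:
            c1, c0 = divmod(chunk[0], 45)
            out += [BASE45_CHARSET[c0], BASE45_CHARSET[c1]]
    return ''.join(out)


signing_key = SigningKey.generate()
# hex value expected in the verify-key attribute of <qrcode-reader>
verify_key_hex = signing_key.verify_key.encode(encoder=HexEncoder).decode()

payload = 'uuid: 1234\nvalidity_start: %s\nvalidity_end: %s\nnom: Dupont\n' % (
    time.time(),
    time.time() + 3600,
)
# PyNaCl's signed message is signature || message, as expected by nacl.sign.open()
qr_content = encode_base45(bytes(signing_key.sign(payload.encode('utf-8'))))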

File diff suppressed because it is too large

View File

@ -0,0 +1,38 @@
{% load i18n %}
{% load static %}
<!doctype html>
<html>
<head>
<link rel="icon" type="image/x-icon" href="{% static 'qrcode/img/favicon.ico' %}">
<meta name="viewport" content="width=device-width, initial-scale=1">
<script type="application/json" id="qrcode-reader-i18n">
{
"always": "Toujours",
"close": "{% trans 'Close' %}",
"expired": "{% trans 'QR code Expired' %}",
"from": "{% trans 'From' %}",
"invalid_content": "{% trans "This QR code isn't supported by this application." %}",
"invalid_signature": "{% trans 'Signature verification failed.' %}",
"invalid_title": "Invalid QR Code",
"never": "Jamais",
"not_supported": "{% trans "QR code reader isn\'t supported on your platform. Please update your browser." %}",
"not_yet_valid": "{% trans 'QR code not yet valid' %}",
"to": "{% trans 'To' %}",
"valid": "{% trans 'Valid QR code' %}"
}
</script>
<link rel="stylesheet" href="{% static 'qrcode/css/style.css' %}">
<script type="module" src="{% static 'qrcode/js/qrcode-reader.js' %}"></script>
</head>
<body>
{% if not started %}
{% trans "Reader isn't usable yet." %}
{% elif expired %}
{% trans "Reader has expired." %}
{% else %}
<qrcode-reader verify-key="{{ verify_key }}"></qrcode-reader>
{% endif %}
</body>
</html>

View File

@ -23,6 +23,12 @@ from django.utils.translation import gettext_lazy as _
from passerelle.base.models import BaseResource, HTTPResource
from passerelle.utils.api import endpoint
from passerelle.utils.soap import SOAPFault
PASS_THROUGH_MESSAGES = [
"La demande de logement n'existe pas dans le système.",
'Votre guichet enregistreur ne couvre pas au moins une des communes souhaitées de la demande de logement.',
]
class SNE(BaseResource, HTTPResource):
@ -61,12 +67,20 @@ class SNE(BaseResource, HTTPResource):
},
)
def get_demande_logement(self, request, demand_id, **kwargs):
if len(demand_id) != 18:
return {'err_desc': 'demand_id must contain 18 characters'}
client = self.soap_client(wsdl_url=self.wsdl_url, api_error=True)
cert_type = client.get_type('{http://ws.metier.nuu.application.i2/}base64Binary')
cert = cert_type(_value_1=self.cert_public_bytes)
res = client.service.getDemandeLogement(
numUnique=demand_id, nomCertificat=self.certificate_name, certificat=cert
)
try:
res = client.service.getDemandeLogement(
numUnique=demand_id, nomCertificat=self.certificate_name, certificat=cert
)
except SOAPFault as e:
message = e.data.get('soap_fault', {}).get('message', '')
if message in PASS_THROUGH_MESSAGES:
return {'err_desc': message}
raise
namespaces = {
'http://nuu.application.i2/': None,
}

View File

@ -15,6 +15,7 @@
# along with this program. If not, see <http://www.gnu.org/licenses/>.
import collections
import hashlib
import zeep
import zeep.helpers
@ -135,6 +136,9 @@ class SOAPConnector(BaseResource, HTTPResource):
method.endpoint_info.methods.append('get')
def check_status(self):
return self.operations_and_schemas
def get_endpoints_infos(self):
endpoints = super().get_endpoints_infos()
@ -143,6 +147,7 @@ class SOAPConnector(BaseResource, HTTPResource):
except Exception as e:
self.set_availability_status('down', message=exception_to_text(e)[:500])
return endpoints
self.set_availability_status('up')
for name, input_schema, output_schema in operations_and_schemas:
kwargs = {
@ -184,42 +189,61 @@ class SOAPConnector(BaseResource, HTTPResource):
operations_and_schemas.append((name, input_schema, output_schema))
return operations_and_schemas
def type2schema(self, xsd_type, keep_root=False, compress=False):
# simplify schema: when a type contains a unique element, it will try
# to match any dict or list with it on input and will flatten the
# schema on output.
if (
isinstance(xsd_type, zeep.xsd.ComplexType)
and len(xsd_type.elements) == 1
and not keep_root
and compress
):
if xsd_type.elements[0][1].max_occurs != 1:
@classmethod
def type2schema(cls, xsd_type, keep_root=False, compress=False):
seen = set()
def to_id(s):
return f'ref-{hashlib.md5(str(s).encode()).hexdigest()}'
def t2s(xsd_type):
type_name = xsd_type.qname or xsd_type.name
if isinstance(xsd_type, zeep.xsd.ComplexType):
if type_name in seen:
return {'$ref': '#' + to_id(type_name)}
seen.add(type_name)
# simplify schema: when a type contains a unique element, it will try
# to match any dict or list with it on input and will flatten the
# schema on output.
if (
isinstance(xsd_type, zeep.xsd.ComplexType)
and len(xsd_type.elements) == 1
and not keep_root
and compress
# and is not recursive
and xsd_type.elements[0][1].type != xsd_type
):
if xsd_type.elements[0][1].max_occurs != 1:
schema = {
'type': 'array',
'items': t2s(xsd_type.elements[0][1].type),
}
else:
schema = t2s(xsd_type.elements[0][1].type)
elif isinstance(xsd_type, zeep.xsd.ComplexType):
properties = collections.OrderedDict()
schema = {
'type': 'array',
'items': self.type2schema(xsd_type.elements[0][1].type, compress=compress),
'type': 'object',
'properties': properties,
'$anchor': to_id(type_name),
}
for key, element in xsd_type.elements:
if element.min_occurs > 0:
schema.setdefault('required', []).append(key)
element_schema = t2s(element.type)
if element.max_occurs == 'unbounded' or element.max_occurs > 1:
element_schema = {'type': 'array', 'items': element_schema}
properties[key] = element_schema
if not properties:
schema = {'type': 'null'}
elif isinstance(xsd_type, zeep.xsd.BuiltinType):
schema = {'type': 'string'}
else:
schema = self.type2schema(xsd_type.elements[0][1].type, compress=compress)
elif isinstance(xsd_type, zeep.xsd.ComplexType):
properties = collections.OrderedDict()
schema = {
'type': 'object',
'properties': properties,
}
for key, element in xsd_type.elements:
if element.min_occurs > 0:
schema.setdefault('required', []).append(key)
element_schema = self.type2schema(element.type, compress=compress)
if element.max_occurs == 'unbounded' or element.max_occurs > 1:
element_schema = {'type': 'array', 'items': element_schema}
properties[key] = element_schema
if not properties:
schema = {'type': 'null'}
elif isinstance(xsd_type, zeep.xsd.BuiltinType):
schema = {'type': 'string'}
else:
schema = {}
if xsd_type.qname:
schema['description'] = str(xsd_type.qname).replace('{http://www.w3.org/2001/XMLSchema}', 'xsd:')
return schema
schema = {}
if xsd_type.qname:
schema['description'] = str(xsd_type.qname).replace(
'{http://www.w3.org/2001/XMLSchema}', 'xsd:'
)
return schema
return t2s(xsd_type)
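The rewritten type2schema keeps a set of complex types already seen and, on a second encounter, emits a $ref pointing at the $anchor of the first occurrence instead of recursing forever. A toy illustration of that guard, without zeep and with made-up type names:

import hashlib


def to_id(name):
    return 'ref-%s' % hashlib.md5(str(name).encode()).hexdigest()


def describe(type_name, fields, types, seen=None):
    # fields: mapping of field name to the name of another complex type,
    # or None for a plain string leaf
    seen = set() if seen is None else seen
    if type_name in seen:
        # already expanded once: point back at its anchor instead of recursing
        return {'$ref': '#' + to_id(type_name)}
    seen.add(type_name)
    properties = {
        key: describe(child, types[child], types, seen) if child else {'type': 'string'}
        for key, child in fields.items()
    }
    return {'type': 'object', 'properties': properties, '$anchor': to_id(type_name)}


# 'Person' references itself through 'children', like a recursive WSDL type would
types = {'Person': {'name': None, 'children': 'Person'}}
schema = describe('Person', types['Person'], types)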

View File

@ -28,7 +28,10 @@ class Command(BaseCommand):
def add_arguments(self, parser):
parser.add_argument(
'frequency', metavar='FREQUENCY', type=str, help='hourly/daily/weekly/monthly/availability/jobs'
'frequency',
metavar='FREQUENCY',
type=str,
help='every5min/hourly/daily/weekly/monthly/availability/jobs',
)
parser.add_argument(
'--connector',
@ -46,7 +49,7 @@ class Command(BaseCommand):
)
def handle(self, frequency, **options):
if frequency not in ('hourly', 'daily', 'weekly', 'monthly', 'availability', 'jobs'):
if frequency not in ('every5min', 'hourly', 'daily', 'weekly', 'monthly', 'availability', 'jobs'):
raise CommandError('unknown frequency')
errors = []
for app in get_all_apps():
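The new every5min frequency only does something for connectors that override the hook. A minimal sketch (hypothetical connector; app_label is an assumption for illustration) of how a resource would opt in, so that running the cron management command above with the every5min argument, for instance every five minutes from a scheduler, triggers it:

from passerelle.base.models import BaseResource


class MyConnector(BaseResource):  # hypothetical connector, for illustration only
    class Meta:
        app_label = 'myconnector'  # assumption: registered like any other connector app

    def every5min(self):
        super().every5min()
        # refresh data that goes stale quickly, e.g. a remote referential
        self.logger.info('refreshing short-lived data')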

View File

@ -8,6 +8,7 @@ import logging
import os
import re
import sys
import time
import traceback
import uuid
from contextlib import contextmanager
@ -38,6 +39,7 @@ import passerelle
from passerelle.forms import GenericConnectorForm
from passerelle.utils import ImportSiteError
from passerelle.utils.api import endpoint
from passerelle.utils.defer import run_later_if_in_transaction
from passerelle.utils.jsonresponse import APIError
from passerelle.utils.sftp import SFTP, SFTPField
@ -234,8 +236,23 @@ class BaseResource(models.Model):
except AvailabilityParameters.DoesNotExist:
return AvailabilityParameters(resource_type=resource_type, resource_pk=self.id)
soap_client_cache_timeout = 0
soap_client_cache = {}
def soap_client(self, **kwargs):
return passerelle.utils.soap.SOAPClient(resource=self, **kwargs)
if self.soap_client_cache_timeout:
key = (self, kwargs['wsdl_url'])
if key in self.soap_client_cache:
client, timestamp = self.soap_client_cache[key]
if timestamp > time.time() - self.soap_client_cache_timeout:
return client
client = passerelle.utils.soap.SOAPClient(resource=self, **kwargs)
if self.soap_client_cache_timeout:
self.soap_client_cache[key] = client, time.time()
return client
@classmethod
def get_verbose_name(cls):
@ -571,6 +588,9 @@ class BaseResource(models.Model):
else:
ResourceStatus.objects.filter(pk=current_status.pk).update(message=message)
def every5min(self):
pass
def hourly(self):
pass
@ -1042,7 +1062,10 @@ class ProxyLogger:
(exc_type, exc_value, dummy) = sys.exc_info()
attr['extra']['error_summary'] = traceback.format_exception_only(exc_type, exc_value)
ResourceLog.objects.create(**attr)
# keep log even if transaction fails if:
# * it's at least a warning
# * or if logger is configured for debug
run_later_if_in_transaction(ResourceLog.objects.create, **attr)
admins = settings.ADMINS
logging_parameters = self.connector.logging_parameters
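The zeep client cache added to BaseResource is opt-in: a connector sets soap_client_cache_timeout (ToulouseMaelis sets it to 300 seconds later in this changeset) and repeated soap_client() calls for the same WSDL within that window return the same client instead of re-fetching and re-parsing the WSDL. A minimal sketch with a hypothetical connector:

from passerelle.base.models import BaseResource, HTTPResource


class MySOAPConnector(BaseResource, HTTPResource):  # hypothetical connector
    soap_client_cache_timeout = 300  # seconds; 0, the default, disables caching

    class Meta:
        app_label = 'mysoapconnector'  # assumption, for illustration only

    def check_soap(self, wsdl_url):
        first = self.soap_client(wsdl_url=wsdl_url)
        second = self.soap_client(wsdl_url=wsdl_url)
        assert first is second  # served from soap_client_cache within the timeout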

View File

@ -110,7 +110,10 @@ def censor(string):
return re.sub(r'://([^/]*):([^/]*?)@', r'://\1:***@', string)
def render_json_schema(schema):
def render_json_schema(schema, anchor_map=None):
if anchor_map is None:
anchor_map = {}
if not isinstance(schema, dict):
if schema is True:
return mark_safe('<em>%s</em>') % _('always valid')
@ -133,6 +136,11 @@ def render_json_schema(schema):
def html_type(s):
return '<span class="type">%s</span>' % s
def to_id(ref):
_ref = ref.lstrip('#')
_id = id(anchor_map.get(_ref))
return f'schema-object-{_ref}-{_id}'
if 'anyOf' in schema:
return many_of('anyOf', schema['anyOf'])
@ -145,10 +153,17 @@ def render_json_schema(schema):
original_schema = schema
schema = schema.copy()
schema.pop('$schema', None)
schema.pop('$id', None)
_anchor = schema.pop('$anchor', None)
if _anchor:
anchor_map.setdefault(_anchor, original_schema)
title = schema.pop('title', None)
description = schema.pop('description', None)
typ = schema.pop('type', None)
_ref = schema.pop('$ref', None)
if _ref and _ref.startswith('#'):
target_schema = anchor_map.get(_ref[1:], {})
target_title = target_schema.get('title') or target_schema.get('description') or 'referenced schema'
return format_html('<a href="#{}">{}</a>', to_id(_ref), target_title)
if typ == 'null':
return mark_safe(html_type('null'))
if typ == 'string':
@ -181,10 +196,12 @@ def render_json_schema(schema):
if typ == 'array':
s = html_type('array') + ' '
if 'items' in schema:
s += render_json_schema(schema['items'])
s += render_json_schema(schema['items'], anchor_map)
return mark_safe(s)
if typ == 'object':
s = html_type('object')
if _anchor:
s += f'<a id="{to_id(_anchor)}"></a>'
unflatten = schema.pop('unflatten', False)
merge_extra = schema.pop('merge_extra', False)
properties = schema.pop('properties', {})
@ -211,6 +228,9 @@ def render_json_schema(schema):
def render_property_schema(key, html, sub):
nonlocal s
_anchor = sub.get('$anchor', None)
if _anchor:
anchor_map.setdefault(_anchor, sub.copy())
required = key in required_keys
sub_description = sub.pop('description', '')
sub_title = sub.pop('title', '')
@ -226,7 +246,7 @@ def render_json_schema(schema):
if sub_title or '\n' in sub_description:
s += format_html('\n<p class="description">{}</p>', sub_description)
if sub:
s += format_html('\n{0}', render_json_schema(sub))
s += format_html('\n{0}', render_json_schema(sub, anchor_map))
s += '</li>'
if properties or pattern_properties:
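With the anchor_map threading above, a schema carrying the $anchor/$ref pairs emitted by the SOAP connector renders the reference as a link back to the anchored object instead of expanding it again. An illustrative schema (values are made up) that exercises this path:

person_schema = {
    'description': 'Person',
    'type': 'object',
    '$anchor': 'ref-person',  # stored in anchor_map on first render
    'properties': {
        'name': {'type': 'string'},
        'children': {
            'type': 'array',
            # rendered as <a href="#schema-object-ref-person-..."> instead of
            # being expanded a second time
            'items': {'$ref': '#ref-person'},
        },
    },
}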

View File

@ -172,7 +172,9 @@ class CaluireAxel(BaseResource):
for child in family_data.get('MEMBRE', []):
child['id'] = child['IDENT']
child['text'] = '{} {}'.format(child['PRENOM'].strip(), child['NOM'].strip()).strip()
child['text'] = '{} {}'.format(
(child['PRENOM'] or '').strip(), (child['NOM'] or '').strip()
).strip()
cache.set(cache_key, family_data, 30) # 30 seconds
return family_data

View File

@ -0,0 +1,69 @@
# Generated by Django 3.2.18 on 2023-10-25 20:41
from django.db import migrations, models
class Migration(migrations.Migration):
initial = True
dependencies = [
('base', '0030_resourcelog_base_resour_appname_298cbc_idx'),
]
operations = [
migrations.CreateModel(
name='IsereESRH',
fields=[
(
'id',
models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID'),
),
('title', models.CharField(max_length=50, verbose_name='Title')),
('slug', models.SlugField(unique=True, verbose_name='Identifier')),
('description', models.TextField(verbose_name='Description')),
(
'basic_auth_username',
models.CharField(
blank=True, max_length=128, verbose_name='Basic authentication username'
),
),
(
'basic_auth_password',
models.CharField(
blank=True, max_length=128, verbose_name='Basic authentication password'
),
),
(
'client_certificate',
models.FileField(
blank=True, null=True, upload_to='', verbose_name='TLS client certificate'
),
),
(
'trusted_certificate_authorities',
models.FileField(blank=True, null=True, upload_to='', verbose_name='TLS trusted CAs'),
),
(
'verify_cert',
models.BooleanField(blank=True, default=True, verbose_name='TLS verify certificates'),
),
(
'http_proxy',
models.CharField(blank=True, max_length=128, verbose_name='HTTP and HTTPS proxy'),
),
('base_url', models.URLField(verbose_name='Base API URL')),
(
'users',
models.ManyToManyField(
blank=True,
related_name='_isere_esrh_isereesrh_users_+',
related_query_name='+',
to='base.ApiUser',
),
),
],
options={
'verbose_name': 'ESRH Isère',
},
),
]

View File

@ -0,0 +1,151 @@
# passerelle - uniform access to multiple data sources and services
# Copyright (C) 2023 Entr'ouvert
#
# This program is free software: you can redistribute it and/or modify it
# under the terms of the GNU Affero General Public License as published
# by the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU Affero General Public License for more details.
#
# You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <http://www.gnu.org/licenses/>.
import re
from datetime import datetime, timezone
from django.db import models
from django.utils.translation import gettext_lazy as _
from requests import ConnectionError, RequestException, Timeout
from passerelle.base.models import BaseResource, HTTPResource
from passerelle.utils.api import endpoint
from passerelle.utils.jsonresponse import APIError
def iso_now():
now = datetime.now(timezone.utc)
local_now = now.astimezone()
return local_now.isoformat()
class IsereESRH(BaseResource, HTTPResource):
category = _('Business Process Connectors')
base_url = models.URLField(_('Base API URL'))
class Meta:
verbose_name = _('ESRH Isère')
def _get(self, endpoint, params=None):
try:
response = self.requests.get(f'{self.base_url}/api/v2/{endpoint}', params=params)
response.raise_for_status()
except (RequestException, ConnectionError, Timeout) as e:
raise APIError('HTTP request failed', data={'exception': str(e)})
try:
response_json = response.json()
except ValueError as e:
raise APIError('ESRH returned invalid json', data={'exception': str(e)})
if (
not isinstance(response_json, dict)
or response_json.get('values') is None
or not isinstance(response_json['values'], list)
or any(not isinstance(item, dict) for item in response_json['values'])
):
raise APIError(
'ESRH returned malformed JSON: expecting a dictionary with a "values" key containing a list of objects.'
)
return response_json['values']
@endpoint(
description=_('Get official information'),
parameters={
'number': {'description': _('Official registration number'), 'example_value': '500'},
'authority': {'description': _('Public authority'), 'example_value': 'CG38'},
},
)
def official(self, request, number, authority):
agents = self._get('Agent', params={'numero': number, 'collectivite': authority})
if len(agents) == 0:
return {'data': None}
agent = agents[0]
if 'agentId' not in agent:
raise APIError(
'Malformed response: "values" elements are expected to be objects with an "agentId" key'
)
agent_id = agent['agentId']
agent['DossiersStatutaire'] = self._get(
f'Agent/{agent_id}/DossiersStatutaire', params={'aDate': iso_now()}
)
return {'data': agent}
@endpoint(
description=_('Get entities'),
parameters={
'label_pattern': {
'description': _('Filter entities whose label matches this regex (case insensitive)'),
'example_value': '^dir\\..*',
},
'code_pattern': {
'description': _('Filter entities whose code matches this regex (case insensitive)'),
'example_value': '^6517.*',
},
},
)
def entities(self, request, label_pattern=None, code_pattern=None):
entities = self._get('Entite', params={'aDate': iso_now()})
if label_pattern:
label_pattern = re.compile(label_pattern, re.IGNORECASE)
if code_pattern:
code_pattern = re.compile(code_pattern, re.IGNORECASE)
result = []
for entity in entities:
id = entity.get('entiteId')
label = entity.get('libelle', '')
code = entity.get('code', '')
if label_pattern and not label_pattern.match(label):
continue
if code_pattern and not code_pattern.match(code):
continue
result.append(entity | {'id': id, 'text': label})
return {'data': result}
@endpoint(
name='job-types',
description=_('Get job types'),
parameters={
'authority': {'description': _('Public authority'), 'example_value': 'CG38'},
},
)
def job_types(self, request, authority):
job_types = self._get(
'Poste', params={'codeCollectivite': authority, 'avecLibellePoste': True, 'aDate': iso_now()}
)
result = []
for job_type in job_types:
id = job_type.get('posteId')
labels = job_type.get('libelles', [])
label = 'N/A' if len(labels) == 0 else labels[0]['libelle']
result.append(job_type | {'id': id, 'text': label})
return {'data': result}
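A hedged usage sketch of the new entities endpoint; the host, connector slug and apikey parameter are placeholders, only the label_pattern filter (a case-insensitive regex matched against each entity's label) and the shape of the response ('data' list of items carrying 'id'/'text') come from the code above:

import requests

resp = requests.get(
    'https://passerelle.example.org/isere-esrh/esrh/entities',  # placeholder URL
    params={'label_pattern': r'^dir\..*', 'apikey': 'secret'},  # apikey: assumption
)
resp.raise_for_status()
for entity in resp.json()['data']:
    print(entity['id'], entity['text'])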

View File

@ -15,6 +15,7 @@
# along with this program. If not, see <http://www.gnu.org/licenses/>.
import base64
import copy
import phonenumbers
import requests
@ -35,7 +36,7 @@ ATTACHMENT_SCHEMA = {
'$schema': 'http://json-schema.org/draft-04/schema#',
'title': _('Attachment and degree data.'),
'description': '',
'required': ['application_id', 'name', 'file'],
'required': ['application_id', 'name'],
'type': 'object',
'properties': {
'application_id': {
@ -56,26 +57,38 @@ ATTACHMENT_SCHEMA = {
},
'file': {
'description': _('File to attach.'),
'type': 'object',
'required': ['filename', 'content_type', 'content'],
'properties': {
'filename': {
'description': _('File name'),
'type': 'string',
'oneOf': [
{
'type': 'object',
'required': ['filename', 'content_type', 'content'],
'properties': {
'filename': {
'description': _('File name'),
'type': 'string',
},
'content_type': {
'description': _('MIME type'),
'type': 'string',
},
'content': {
'description': _('Content'),
'type': 'string',
},
},
},
'content_type': {
'description': _('MIME type'),
'type': 'string',
},
'content': {
'description': _('Content'),
'type': 'string',
},
},
{'type': 'null'},
],
},
},
}
DIPLOMA_SCHEMA = copy.deepcopy(ATTACHMENT_SCHEMA)
DIPLOMA_SCHEMA['properties']['degree_level'] = {
'description': _("ID of an element of the data source 'niveau-diplome'."),
'type': 'string',
'pattern': '^[0-9]*$',
}
def boolean_field(description):
return {
@ -243,9 +256,8 @@ APPLICATION_SCHEMA = {
},
'rgpd_agreement': boolean_field(_('RGPD agreement.')),
'job_types': {
'description': _("IDs of elements of the data source 'type-emploi'."),
'type': 'array',
'items': {'type': 'string', 'pattern': '^[0-9]*$'},
'description': _('Wanted job types'),
'type': 'string',
},
'job_realms': {
'description': _("IDs of elements of the data source 'domaine-emploi'."),
@ -474,7 +486,7 @@ class Resource(BaseResource, HTTPResource):
def update_announces(self):
try:
results = self.http_request('GET', 'data/annonce?viewIntegrationName=api_publik')
results = self.http_request('GET', 'data/annonce?viewIntegrationName=api_publik&count=200')
except requests.RequestException:
raise UpdateError(_('Service is unavailable'))
announces = []
@ -489,7 +501,6 @@ class Resource(BaseResource, HTTPResource):
('civilite', 'data/civilite', None, 200),
('nationalite', 'data/nationalite1', None, 200),
('situation_actuelle', 'data/situation_actuelle', None, 200),
('type_emploi', 'data/type_emploi', None, 200),
('domaine_emploi', 'data/domaine_emploi', None, 200),
('sous_domaine_emploi', 'data/sous_domaine_emploi', 'domaine_emploi', 200),
('emploi', 'custom/emploi', 'sous_domaine_emploi', None),
@ -503,13 +514,12 @@ class Resource(BaseResource, HTTPResource):
'nationalite': 'R1249730',
'situation_actuelle': 'R1258320',
'annonce': 'R14848305',
'type_emploi': 'R1249707',
'domaine_emploi': 'R60845221',
'sous_domaine_emploi': 'R60845244',
'emploi': 'R15017962',
'niveau_diplome': 'R1249737',
'habilitation': 'R1276043',
'offre': FIELD_ANNOUNCE_FKEY_DEMANDE_DE_PERSONNEL,
'offre': 'R14846954',
}
def update_referentiels(self):
@ -595,20 +605,21 @@ class Resource(BaseResource, HTTPResource):
return int(id)
phone = post_data.get('phone', None)
if phone is not None:
formatted_phone = ''
if phone:
try:
parsed_phone = phonenumbers.parse(phone, 'FR')
except phonenumbers.NumberParseException:
raise APIError(_('Couldn\'t recognize provided phone number.'))
formatted_phone = f'+{parsed_phone.country_code} {parsed_phone.national_number}'
formatted_phone = f'+{parsed_phone.country_code} {parsed_phone.national_number}'
announce_id = _get_id('announce_id')
offer_id = None
if announce_id is not None:
# passerelle catches DoesNotExist and converts it to 404
announce_document = self.announces_documents.get(external_id=f'announce-{announce_id}')
offer_id = announce_document.data['offer_id']
offer_id = str(announce_document.data['offer_id'])
request_data = {
'type_de_candidature': post_data.get('type', 'E'),
@ -645,7 +656,7 @@ class Resource(BaseResource, HTTPResource):
self.REFERENTIELS_FKEYS['origine_candidature']: _get_id('origin'),
'precision_origine_candidature': post_data.get('origin_precisions', None),
'accord_RGPD': get_bool(post_data.get('rgpd_agreement', False)),
self.REFERENTIELS_FKEYS['type_emploi']: [int(id) for id in post_data.get('job_types', [])],
'type_emploi_souhaite': post_data.get('job_types', None),
self.REFERENTIELS_FKEYS['domaine_emploi']: [int(id) for id in post_data.get('job_realms', [])],
self.REFERENTIELS_FKEYS['sous_domaine_emploi']: [
int(id) for id in post_data.get('job_families', [])
@ -686,11 +697,13 @@ class Resource(BaseResource, HTTPResource):
self.http_request(
'POST',
f'data/candidature/{application_id}/fields/{attachment_name}?viewIntegrationName=api_publik',
json={
data={
'contentType': file['content_type'],
'value': file['content'],
'fileName': file['filename'],
},
files={
'value': (None, file['content'], None),
},
)
return {'err': 0}
@ -699,31 +712,35 @@ class Resource(BaseResource, HTTPResource):
name='attach-degree',
post={
'description': _('Attach a degree to an application.'),
'request_body': {'schema': {'application/json': ATTACHMENT_SCHEMA}},
'request_body': {'schema': {'application/json': DIPLOMA_SCHEMA}},
},
)
def attach_degree(self, request, post_data):
application_id = post_data['application_id']
degree_label = post_data['name']
file = post_data['file']
file = post_data.get('file')
degree_level = post_data.get('degree_level')
degree_data = self.http_request(
'POST',
'data/diplome2?viewIntegrationName=api_publik',
json={'intitule_diplome': degree_label, 'R1258215': application_id},
json={'intitule_diplome': degree_label, 'R1258215': application_id, 'R79264997': degree_level},
)
degree_id = degree_data[0]['id']
self.http_request(
'POST',
f'data/diplome2/{degree_id}/fields/justificatif_diplome?viewIntegrationName=api_publik',
json={
'contentType': file['content_type'],
'value': file['content'],
'fileName': file['filename'],
},
)
if file is not None:
self.http_request(
'POST',
f'data/diplome2/{degree_id}/fields/justificatif_diplome?viewIntegrationName=api_publik',
data={
'contentType': file['content_type'],
'fileName': file['filename'],
},
files={
'value': (None, file['content'], None),
},
)
return {'err': 0}
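The upload now goes out as a multipart request instead of a JSON body: metadata travels in plain form fields while the content itself is sent as the 'value' file part. A minimal requests sketch of the equivalent call (host, record id and field values are placeholders):

import requests

document = {'filename': 'cv.pdf', 'content_type': 'application/pdf', 'content': '<base64 content>'}
requests.post(
    'https://foederis.example.invalid/data/candidature/123/fields/cv'
    '?viewIntegrationName=api_publik',  # placeholder host and application id
    data={'contentType': document['content_type'], 'fileName': document['filename']},
    files={'value': (None, document['content'], None)},
)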

View File

@ -0,0 +1,17 @@
# Generated by Django 3.2.18 on 2023-11-17 16:54
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('toulouse_maelis', '0012_subscription'),
]
operations = [
migrations.AddField(
model_name='invoice',
name='maelis_no_more_returned_date',
field=models.DateTimeField(null=True),
),
]

View File

@ -30,15 +30,18 @@ from django.core.serializers.json import DjangoJSONEncoder
from django.db import models, transaction
from django.db.models import JSONField
from django.http import Http404, HttpResponse
from django.template.loader import render_to_string as render_template_to_string
from django.utils import dateformat
from django.utils.dateparse import parse_date
from django.utils.dateparse import parse_date, parse_datetime
from django.utils.text import slugify
from django.utils.timezone import now
from requests.exceptions import RequestException
from zeep.helpers import serialize_object
from zeep.wsse.username import UsernameToken
from passerelle.apps.base_adresse.models import CityModel
from passerelle.base.models import BaseResource, HTTPResource
from passerelle.base.signature import sign_url
from passerelle.utils.api import endpoint
from passerelle.utils.conversion import simplify
from passerelle.utils.jsonresponse import APIError
@ -95,6 +98,7 @@ class ToulouseMaelis(BaseResource, HTTPResource):
category = 'Connecteurs métiers'
_category_ordering = ['Famille', 'Activités']
soap_client_cache_timeout = 300 # 5 minutes of cache for zeep.Client
class Meta:
verbose_name = 'Toulouse Maelis'
@ -155,6 +159,39 @@ class ToulouseMaelis(BaseResource, HTTPResource):
# delete extraneous items
self.referential.filter(referential_name=referential_name, updated__lt=last_update).delete()
def update_catalog_referential(self):
last_update = now()
ref_date = last_update.date()
try:
data = self.call(
'Activity',
'readActivityList',
# pass schoolyear as '1970', it's not actually used and activities will be
# returned according to dateStartCalend/dateEndCalend.
schoolyear='1970',
dateStartCalend=(ref_date - datetime.timedelta(days=365)).isoformat(),
dateEndCalend=(ref_date + datetime.timedelta(days=365)).isoformat(),
)
except Exception as e:
raise UpdateError('Service indisponible : %s' % str(e))
for item in data or []:
id_key = item['activityPortail']['idAct']
text = item['activityPortail'].get('libelle2') or item['activityPortail']['libelle'] or ''
text = text.strip()
self.referential.update_or_create(
resource_id=self.id,
referential_name='Activity',
item_id=id_key,
defaults={
'item_text': text,
'item_data': dict({'id': id_key, 'text': text}, **item),
'updated': last_update,
},
)
# delete extraneous items
self.referential.filter(referential_name='Activity', updated__lt=last_update).delete()
def get_referential_data(self, service_name, referential_name):
try:
return self.call(service_name, 'read' + referential_name + 'List')
@ -222,6 +259,7 @@ class ToulouseMaelis(BaseResource, HTTPResource):
if referential_name in ['Direct', 'Service']:
id_key, text_key = 'id', 'lib1'
self.update_referential(referential_name, data, id_key, text_key)
self.update_catalog_referential()
def update_ape_referentials(self):
indicators = self.call('Ape', 'readApeIndicatorList')
@ -257,11 +295,13 @@ class ToulouseMaelis(BaseResource, HTTPResource):
super().daily()
self.update_referentials()
def every5min(self):
self.update_activity_referentials()
def update_referentials(self):
try:
self.update_family_referentials()
self.update_site_referentials()
self.update_activity_referentials()
self.update_ape_referentials()
self.update_invoice_referentials()
# merge zip codes from base adresse into town referential
@ -710,10 +750,8 @@ class ToulouseMaelis(BaseResource, HTTPResource):
if value is None:
dico[key] = ''
def read_rl_list_raw(self, family_id, text_template=None, income_year=None):
def read_rl_list_raw(self, family_id, text_template, income_year=None):
result = self.get_family_raw(family_id, incomeYear=income_year)
if not text_template:
text_template = '{{ lastname }} {{ firstname }}'
for rlg in 'RL1', 'RL2':
item = result.get(rlg)
@ -725,10 +763,8 @@ class ToulouseMaelis(BaseResource, HTTPResource):
item['family_id'] = family_id
yield item
def read_child_list_raw(self, family_id, text_template=None):
def read_child_list_raw(self, family_id, text_template):
result = self.get_family_raw(family_id)
if not text_template:
text_template = '{{ lastname }} {{ firstname }}'
for item in result['childList']:
self.add_text_value_to_child(item)
@ -1116,17 +1152,26 @@ class ToulouseMaelis(BaseResource, HTTPResource):
def link(self, request, NameID, post_data):
family_id = post_data['family_id']
response = self.call('Family', 'readFamily', dossierNumber=family_id)
if not (
response['RL1']['birth']
and isinstance(response['RL1']['birth'].get('dateBirth'), datetime.datetime)
):
raise APIError("Maelis provides an invalid dateBirth for RL1 on '%s' family" % family_id)
if not (
response['RL1']['firstname'] == post_data['firstname'].upper()
and response['RL1']['lastname'] == post_data['lastname'].upper()
and response['RL1']['birth']['dateBirth'].strftime('%Y-%m-%d') == post_data['dateBirth']
):
raise APIError("RL1 does not match '%s' family" % family_id)
for rlg in 'RL1', 'RL2':
if not response.get(rlg):
continue
if not (
response[rlg]['birth']
and isinstance(response[rlg]['birth'].get('dateBirth'), datetime.datetime)
):
self.logger.warning(
"Maelis provides an invalid dateBirth for %s on '%s' family", rlg, family_id
)
continue
if (
response[rlg]['firstname'] == post_data['firstname'].upper()
and response[rlg]['lastname'] == post_data['lastname'].upper()
and response[rlg]['birth']['dateBirth'].strftime('%Y-%m-%d') == post_data['dateBirth']
):
break
else:
raise APIError("Data provided does not match any RL on '%s' family" % family_id)
# put invoices into cache
for regie in self.get_referential('Regie'):
@ -1148,6 +1193,58 @@ class ToulouseMaelis(BaseResource, HTTPResource):
link.delete()
return {'data': 'ok'}
@endpoint(
display_category='Famille',
description='Obtenir les comptes usager liés à une famille',
name='get-link-list',
perm='can_access',
parameters={
'NameID': {'description': 'Publik NameID'},
'family_id': {'description': 'Numéro de DUI'},
},
)
def get_link_list(self, request, NameID=None, family_id=None, text_template=None):
family_id = family_id or self.get_link(NameID).family_id
data = [
{'id': x['name_id'], 'context': {'link': x}}
for x in self.link_set.filter(family_id=family_id).values()
]
for item in data:
del item['context']['link']['id']
del item['context']['link']['resource_id']
# call authentic to add user data to context
if getattr(settings, 'KNOWN_SERVICES', {}).get('authentic'):
idp_service = list(settings.KNOWN_SERVICES['authentic'].values())[0]
for item in data:
api_url = sign_url(
urljoin(
idp_service['url'], 'api/users/%s/?orig=%s' % (item['id'], idp_service.get('orig'))
),
key=idp_service.get('secret'),
)
try:
response = self.requests.get(api_url)
except RequestException:
pass
else:
if response.status_code == 200:
item['context']['user'] = response.json()
for key in 'date_joined', 'last_login':
value = item['context']['user'].get(key)
if value:
item['context']['user'][key] = parse_datetime(value)
if item['context']['user'].get('password'):
del item['context']['user']['password']
for item in data:
item['text'] = (
render_template_to_string('toulouse_maelis/family_link_template.txt', item['context'])
.replace('\n', '')
.strip()
)
return {'data': data}
@endpoint(
display_category='Famille',
description='Rechercher un dossier famille',
@ -1215,12 +1312,18 @@ class ToulouseMaelis(BaseResource, HTTPResource):
'family_id': {'description': 'Numéro de DUI'},
'text_template': {
'description': 'Gabarit utilisé pour la valeur text',
'example_value': '{{ lastname }} {{ firstname }}',
},
'income_year': {'description': 'Année de revenu pour filtrer les quotients'},
},
)
def read_rl_list(self, request, NameID=None, family_id=None, text_template=None, income_year=None):
def read_rl_list(
self,
request,
NameID=None,
family_id=None,
text_template='{{ lastname }} {{ firstname }}',
income_year=None,
):
family_id = family_id or self.get_link(NameID).family_id
return {'data': list(self.read_rl_list_raw(family_id, text_template))}
@ -1233,16 +1336,14 @@ class ToulouseMaelis(BaseResource, HTTPResource):
'family_id': {'description': 'Numéro de DUI'},
'text_template': {
'description': 'Gabarit utilisé pour la valeur text',
'example_value': '{{ lastname }} {{ firstname }}',
},
},
)
def read_person_list(self, request, NameID=None, family_id=None, text_template=None):
def read_person_list(
self, request, NameID=None, family_id=None, text_template='{{ lastname }} {{ firstname }}'
):
family_id = family_id or self.get_link(NameID).family_id
result = self.get_family_raw(family_id)
if not text_template:
text_template = '{{ lastname }} {{ firstname }}'
data = []
for item in result['emergencyPersonList']:
self.add_text_value_to_person(item)
@ -1261,11 +1362,12 @@ class ToulouseMaelis(BaseResource, HTTPResource):
'family_id': {'description': 'Numéro de DUI'},
'text_template': {
'description': 'Gabarit utilisé pour la valeur text',
'example_value': '{{ lastname }} {{ firstname }}',
},
},
)
def read_child_list(self, request, NameID=None, family_id=None, text_template=None):
def read_child_list(
self, request, NameID=None, family_id=None, text_template='{{ lastname }} {{ firstname }}'
):
family_id = family_id or self.get_link(NameID).family_id
return {'data': list(self.read_child_list_raw(family_id, text_template))}
@ -1278,16 +1380,19 @@ class ToulouseMaelis(BaseResource, HTTPResource):
'family_id': {'description': 'Numéro de DUI'},
'rl_text_template': {
'description': 'Gabarit utilisé pour la valeur text',
'example_value': '{{ lastname }} {{ firstname }}',
},
'child_text_template': {
'description': 'Gabarit utilisé pour la valeur text',
'example_value': '{{ lastname }} {{ firstname }}',
},
},
)
def read_rl_and_child_list(
self, request, NameID=None, family_id=None, rl_text_template=None, child_text_template=None
self,
request,
NameID=None,
family_id=None,
rl_text_template='{{ lastname }} {{ firstname }}',
child_text_template='{{ lastname }} {{ firstname }}',
):
family_id = family_id or self.get_link(NameID).family_id
return {
@ -1305,16 +1410,19 @@ class ToulouseMaelis(BaseResource, HTTPResource):
'family_id': {'description': 'Numéro de DUI'},
'text_template': {
'description': 'Gabarit utilisé pour la valeur text',
'example_value': '{{ personInfo.lastname }} {{ personInfo.firstname }}',
},
},
)
def read_child_person_list(self, request, child_id, NameID=None, family_id=None, text_template=None):
def read_child_person_list(
self,
request,
child_id,
NameID=None,
family_id=None,
text_template='{{ personInfo.lastname }} {{ personInfo.firstname }}',
):
family_id = family_id or self.get_link(NameID).family_id
result = self.get_child_raw(family_id, child_id)
if not text_template:
text_template = '{{ personInfo.lastname }} {{ personInfo.firstname }}'
data = []
for item in result['authorizedPersonList']:
self.add_text_value_to_child_person(item)
@ -2533,9 +2641,29 @@ class ToulouseMaelis(BaseResource, HTTPResource):
display_category='Inscriptions',
description='Lister les années scolaires',
name='read-school-years-list',
parameters={
'subscribable': {
'description': "N'afficher que les années ouvertes aux inscriptions",
'example_value': '0',
},
},
)
def read_school_years_list(self, request):
return {'data': self.get_referential('YearSchool')}
def read_school_years_list(self, request, subscribable='1'):
subscribable = utils.strtobool(subscribable)
referential = self.get_referential('YearSchool')
data = []
for item in referential:
if subscribable is True:
start_date = item.get('dateStartSubscribeSchool')
end_date = item.get('dateEndSubscribeSchool')
if not (start_date and end_date):
continue
start_date = parse_datetime(start_date)
end_date = parse_datetime(end_date)
if not (start_date <= now() <= end_date):
continue
data.append(item)
return {'data': data}
@endpoint(
display_category='Inscriptions',
@ -2662,6 +2790,8 @@ class ToulouseMaelis(BaseResource, HTTPResource):
)
def create_child_school_pre_registration(self, request, post_data):
response = self.call('Family', 'preSubscribeSchoolPerim', **post_data)
if not response.get('subscribeSchoolBean'):
raise APIError(response.get('returnMessage') or 'no data returned')
return {'data': response}
@endpoint(
@ -2696,20 +2826,12 @@ class ToulouseMaelis(BaseResource, HTTPResource):
display_category='Inscriptions',
description='Obtenir le catalogue des activités loisir, avec leurs critères de recherche',
name='read-activity-list',
parameters={
'ref_date': {
'description': "Date de référence, utilisée pour déduire l'année scolaire",
'type': 'date',
},
},
)
def read_activity_list(self, request, ref_date=None):
if not ref_date:
ref_date = now().date()
def read_activity_list(self, request):
labels = {
'service': 'Service',
'nature': "Nature de l'activité",
'type': "Type de l'activité",
'type': 'Discipline',
'public': 'Public',
'day': 'Jours',
'place': 'Lieu',
@ -2718,16 +2840,6 @@ class ToulouseMaelis(BaseResource, HTTPResource):
all_criterias = {key: {'text': value, 'data': {}} for key, value in labels.items()}
criterias = {key: {'text': value, 'data': {}} for key, value in labels.items()}
activities = self.call(
'Activity',
'readActivityList',
# pass schoolyear as '1970', it's not actually used and activities will be
# returned according to dateStartCalend/dateEndCalend.
schoolyear='1970',
dateStartCalend=(ref_date - datetime.timedelta(days=365)).isoformat(),
dateEndCalend=(ref_date + datetime.timedelta(days=365)).isoformat(),
)
def add_criteria(label_key, criteria_key, criteria_value):
if not criteria_value:
return
@ -2748,15 +2860,11 @@ class ToulouseMaelis(BaseResource, HTTPResource):
]
data = []
for activity in activities:
for activity in self.get_referential('Activity'):
activity_type = activity['activityPortail'].get('activityType')
activity_nature = activity_type.get('natureSpec') if activity_type else None
if not activity_nature or activity_nature['code'] not in self.get_loisir_nature_codes():
continue
activity['id'] = activity['activityPortail']['idAct']
activity['text'] = (
activity['activityPortail']['libelle2'] or activity['activityPortail']['libelle']
)
service_id = activity['activityPortail']['idService']
service_text = self.get_referential_value('Service', service_id, default=None)
activity['activityPortail']['idService_text'] = service_text
@ -2780,9 +2888,13 @@ class ToulouseMaelis(BaseResource, HTTPResource):
unit['text'] = unit['libelle']
criterias['public']['data'] = {}
for key, value in utils.get_public_criterias(
datetime.date.today(), unit['birthDateStart'], unit['birthDateEnd']
):
start_dob = unit['birthDateStart']
end_dob = unit['birthDateEnd']
if start_dob:
start_dob = parse_date(start_dob)
if end_dob:
end_dob = parse_date(end_dob)
for key, value in utils.get_public_criterias(datetime.date.today(), start_dob, end_dob):
add_criteria('public', key, value)
update_criterias_order_field(criterias, ['public'])
@ -2809,7 +2921,6 @@ class ToulouseMaelis(BaseResource, HTTPResource):
return {
'data': data,
'meta': {
'ref_date': ref_date.isoformat(),
'all_criterias': all_criterias,
'all_criterias_order': ['service', 'nature', 'type', 'public', 'day', 'place'],
},
@ -2846,14 +2957,12 @@ class ToulouseMaelis(BaseResource, HTTPResource):
type_ids=None,
start_date=None,
end_date=None,
text_template=None,
text_template='{{ activity.libelle2|default:activity.libelle1 }}',
):
family_id = family_id or self.get_link(NameID).family_id
reference_year = None
if start_date and end_date:
start_date, end_date, reference_year = self.get_start_and_end_dates(start_date, end_date)
if not text_template:
text_template = '{{ activity.libelle2|default:activity.libelle1 }}'
response = self.get_person_activity_list_raw(
family_id,
@ -2883,7 +2992,6 @@ class ToulouseMaelis(BaseResource, HTTPResource):
'end_date': {'description': 'Fin de la période'},
'text_template': {
'description': 'Gabarit utilisé pour la valeur text (URL encoding)',
'example_value': '{{ libelle }}',
},
},
)
@ -2896,14 +3004,12 @@ class ToulouseMaelis(BaseResource, HTTPResource):
family_id=None,
start_date=None,
end_date=None,
text_template=None,
text_template='{{ libelle }}',
):
family_id = family_id or self.get_link(NameID).family_id
reference_year = None
if start_date and end_date:
start_date, end_date, reference_year = self.get_start_and_end_dates(start_date, end_date)
if not text_template:
text_template = '{{ libelle }}'
response = self.get_person_activity_list_raw(
family_id,
@ -2940,7 +3046,6 @@ class ToulouseMaelis(BaseResource, HTTPResource):
'end_date': {'description': 'Fin de la période'},
'text_template': {
'description': 'Gabarit utilisé pour la valeur text (URL encoding)',
'example_value': '{{ libelle }}',
},
},
)
@ -2954,14 +3059,12 @@ class ToulouseMaelis(BaseResource, HTTPResource):
family_id=None,
start_date=None,
end_date=None,
text_template=None,
text_template='{{ place.lib2|default:place.lib1 }}',
):
family_id = family_id or self.get_link(NameID).family_id
reference_year = None
if start_date and end_date:
start_date, end_date, reference_year = self.get_start_and_end_dates(start_date, end_date)
if not text_template:
text_template = '{{ place.lib2|default:place.lib1 }}'
response = self.get_person_activity_list_raw(
family_id,
@ -3852,10 +3955,17 @@ class ToulouseMaelis(BaseResource, HTTPResource):
parameters={
'activity_type': {'description': "Type de l'activité.", 'example_value': 'CRECHCO'},
'code_psu': {'description': 'Code PSU.', 'example_value': 'REGULAR'},
'service_ids': {
'description': 'Codes des services à filtrer, séparés par des virgules.',
'example_value': 'A10054639474, A10054639473',
},
},
)
def read_nursery_list(self, request, activity_type=None, code_psu=None):
def read_nursery_list(self, request, activity_type=None, code_psu=None, service_ids=None):
nurseries = self.get_referential('Nursery')
if service_ids:
service_codes = [x.strip() for x in str(service_ids or '').split(',') if x.strip()]
nurseries = [n for n in nurseries if n.get('idService') in service_codes]
if activity_type:
nurseries = [n for n in nurseries if n['activityType']['code'] == activity_type]
if code_psu:
@ -3880,15 +3990,26 @@ class ToulouseMaelis(BaseResource, HTTPResource):
parameters={
'activity_type': {'description': "Type de l'activité.", 'example_value': 'CRECHCO'},
'code_psu': {'description': 'Code PSU. (REGULAR par défaut)'},
'service_ids': {
'description': 'Codes des services à filtrer, séparés par des virgules.',
'example_value': 'A10054639474, A10054639473',
},
},
)
def get_nursery_geojson(self, request, activity_type=None, code_psu='REGULAR'):
def get_nursery_geojson(self, request, activity_type=None, code_psu='REGULAR', service_ids=None):
nurseries = self.get_referential('Nursery')
geojson = {
'type': 'FeatureCollection',
'features': [],
}
service_codes = []
if service_ids:
service_codes = [x.strip() for x in str(service_ids or '').split(',') if x.strip()]
for item in nurseries:
if service_codes and item.get('idService') not in service_codes:
continue
if activity_type and item['activityType']['code'] != activity_type:
continue
if not item['place']['longitude'] or not item['place']['latitude']:
@ -4009,6 +4130,7 @@ class ToulouseMaelis(BaseResource, HTTPResource):
def get_invoices(self, family_id, regie_id):
self.assert_key_in_referential('Regie', regie_id, 'regie_id parameter')
known_invoice_ids = set()
try:
result = self.call(
'Invoice',
@ -4019,19 +4141,22 @@ class ToulouseMaelis(BaseResource, HTTPResource):
dateEnd=now().strftime(utils.json_date_format),
)
except SOAPServiceUnreachable:
pass
known_invoice_ids = None
else:
last_update = now()
for item in result:
try:
invoice = self.invoice_set.get(regie_id=regie_id, invoice_id=item['numInvoice'])
except Invoice.DoesNotExist:
invoice = self.invoice_set.create(
regie_id=regie_id,
invoice_id=item['numInvoice'],
family_id=family_id,
maelis_data=item,
maelis_data_update_date=now(),
)
invoice_id = item['numInvoice']
known_invoice_ids.add(invoice_id)
invoice, created = self.invoice_set.get_or_create(
regie_id=regie_id,
invoice_id=invoice_id,
defaults={
'family_id': family_id,
'maelis_data': item,
'maelis_data_update_date': last_update,
},
)
if created:
self.logger.info("Ajout de %s sur la famille '%s'", repr(invoice), family_id)
else:
if invoice.family_id != family_id:
@ -4046,15 +4171,34 @@ class ToulouseMaelis(BaseResource, HTTPResource):
content_two = json.dumps(item, sort_keys=True, indent=2, cls=DjangoJSONEncoder)
if content_one != content_two:
invoice.maelis_data = item
invoice.maelis_data_update_date = now()
invoice.save()
invoice.maelis_data_update_date = last_update
complete_diff = difflib.ndiff(content_one.split('\n'), content_two.split('\n'))
diff = [x for x in complete_diff if x[0] in ['-', '+']]
self.logger.info(
"Mise à jour de %s sur la famille '%s': %s", repr(invoice), family_id, diff
)
invoice.maelis_no_more_returned_date = None
invoice.save()
return self.invoice_set.filter(regie_id=regie_id, family_id=family_id)
# mark invoices that are no longer returned by maelis
for invoice in self.invoice_set.filter(
regie_id=regie_id, family_id=family_id, updated__lt=last_update
):
invoice.maelis_no_more_returned_date = last_update
invoice.save()
qs = self.invoice_set.filter(
regie_id=regie_id, family_id=family_id, maelis_no_more_returned_date__isnull=True
)
qs.known_invoice_ids = known_invoice_ids
return qs
def make_no_online_payment_reason(self, qs, invoice):
if qs.known_invoice_ids is None:
# soap service is down
return 'Le service est temporairement indisponible.'
else:
return None
@endpoint(
display_category='Facture',
@ -4070,12 +4214,14 @@ class ToulouseMaelis(BaseResource, HTTPResource):
)
def invoices(self, request, regie_id, NameID=None, family_id=None):
family_id = family_id or self.get_link(NameID).family_id
invoices = [
i.format_content()
for i in self.get_invoices(family_id, regie_id)
if i.status() in ['created', 'for_payment']
]
return {'has_invoice_for_payment': True, 'data': invoices}
data = []
qs = self.get_invoices(family_id, regie_id)
for invoice in qs:
if invoice.status() not in ['created', 'for_payment']:
continue
invoice.no_online_payment_reason = self.make_no_online_payment_reason(qs, invoice)
data.append(invoice.format_content())
return {'has_invoice_for_payment': True, 'data': data}
@endpoint(
display_category='Facture',
@ -4096,15 +4242,17 @@ class ToulouseMaelis(BaseResource, HTTPResource):
for i in self.get_invoices(family_id, regie_id)
if i.status() in ['paid', 'notified']
]
return {'data': invoices}
return {'has_invoice_for_payment': True, 'data': invoices}
def get_invoice(self, regie_id, invoice_id):
real_invoice_id = invoice_id.split('-')[-1]
family_id = invoice_id[: -(len(real_invoice_id) + 1)]
qs = self.get_invoices(family_id, regie_id)
try:
invoice = self.get_invoices(family_id, regie_id).get(invoice_id=real_invoice_id)
invoice = qs.get(invoice_id=real_invoice_id)
except Invoice.DoesNotExist:
raise APIError('Invoice not found')
invoice.no_online_payment_reason = self.make_no_online_payment_reason(qs, invoice)
return invoice
@endpoint(
@ -4116,22 +4264,23 @@ class ToulouseMaelis(BaseResource, HTTPResource):
parameters={
'regie_id': {'description': 'Identifiant de la régie', 'example_value': '102'},
'invoice_id': {'description': 'Identifiant de facture', 'example_value': 'IDFAM-42'},
'for_payment': {
'payment': {
'description': "Si présent, annuler la facture panier à l'expiration du delai maximum de paiement depuis la date de l'appel"
},
},
)
def invoice(self, request, regie_id, invoice_id, for_payment=None, **kwargs):
def invoice(self, request, regie_id, invoice_id, payment=None, **kwargs):
invoice = self.get_invoice(regie_id, invoice_id)
if invoice.status() == 'cancelled':
raise APIError('Invoice cancelled')
if for_payment is not None:
if payment is not None and invoice.status() in ['created']:
invoice.start_payment_date = now()
invoice.save()
if invoice.status() == 'cancelling':
raise APIError('Invoice cancelling')
return {
'data': invoice.format_content(),
'data': invoice.format_content(no_online_payment_reason=invoice.no_online_payment_reason),
}
@endpoint(
@ -4283,6 +4432,9 @@ class Invoice(models.Model):
maelis_notification_date = models.DateTimeField(null=True)
basket_generation_date = models.DateTimeField(null=True)
maelis_cancel_notification_date = models.DateTimeField(null=True)
maelis_no_more_returned_date = models.DateTimeField(null=True)
no_online_payment_reason = None
def __repr__(self):
return '<Invoice "%s/%s">' % (self.regie_id, self.invoice_id)
@ -4311,10 +4463,13 @@ class Invoice(models.Model):
# hide invoice to Lingo
return 'cancelling'
if self.maelis_no_more_returned_date is not None:
# invoice cancelled by an agent in Maelis
return 'cancelled_by_agent'
# new invoice
return 'created'
def format_content(self):
def format_content(self, no_online_payment_reason=None):
item = self.maelis_data
paid = self.status() in ['paid', 'notified']
amount_paid = item['amountInvoice'] if paid else item['amountPaid']
@ -4328,15 +4483,21 @@ class Invoice(models.Model):
'amount_paid': amount_paid,
'label': item['libelleTTF'],
'has_pdf': bool(item['pdfName']),
'online_payment': True,
'online_payment': not bool(self.no_online_payment_reason),
'paid': paid,
'payment_date': None,
'no_online_payment_reason': None,
'no_online_payment_reason': self.no_online_payment_reason,
'reference_id': item['numInvoice'],
'maelis_item': item,
}
if paid or self.status() == 'for_payment':
invoice.update({'pay_limit_date': '', 'online_payment': False})
invoice.update(
{
'pay_limit_date': '',
'online_payment': False,
'no_online_payment_reason': None,
}
)
if self.status() == 'for_payment':
invoice['no_online_payment_reason'] = 'Transaction de paiement en cours'
return invoice
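The link() rework above relies on a for/else: each responsible party (RL1, then RL2) is compared to the posted identity, break stops at the first match and the else branch only runs when nothing matched. A standalone illustration on toy data (dates simplified to plain strings):

response = {
    'RL1': {'firstname': 'JEAN', 'lastname': 'DUPONT', 'birth': '1980-01-01'},
    'RL2': {'firstname': 'MARIE', 'lastname': 'DUPONT', 'birth': '1982-05-12'},
}
post_data = {'firstname': 'Marie', 'lastname': 'Dupont', 'dateBirth': '1982-05-12'}

for rlg in 'RL1', 'RL2':
    rl = response.get(rlg)
    if not rl:
        continue
    if (
        rl['firstname'] == post_data['firstname'].upper()
        and rl['lastname'] == post_data['lastname'].upper()
        and rl['birth'] == post_data['dateBirth']
    ):
        break  # this responsible party matches, stop looking
else:
    # no break happened: nothing matched
    raise ValueError("Data provided does not match any RL on this family")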

View File

@ -0,0 +1,11 @@
{% if user %}
{{ user.first_name }} {{ user.last_name }} <{{ user.email }}>
{% else %}
{{ link.name_id }}
{% endif %}
(lié le {{ link.created|date:"d/m/Y" }}
{% if user %}
; compte créé le {{ user.date_joined|date:"d/m/Y" }},
dernière connexion le {{ user.last_login|date:"d/m/Y" }}
{% endif %}
)

View File

@ -0,0 +1,167 @@
#!/usr/bin/python3
import argparse
import copy
import functools
import random
import statistics
import threading
import time
from multiprocessing import Lock, Pool, Process, Queue
from multiprocessing.sharedctypes import Value
import requests
# CONN = 'https://parsifal-passerelle.dev.publik.love/toulouse-maelis/integ-toulouse'
CONN = 'https://passerelle-parsifal.test.entrouvert.org/toulouse-maelis/maelis'
APIKEY = 'nicolas'
FAMILY_ID = '322423' # NICO TEST / UDAVE INTEG
PERSON_ID = '176658' # INTEG
duis = [str(i) for i in range(330120, 33151)]
def get_endpoint(args):
payload = None
if args.test == 'read-family':
url = args.conn + '/read-family?family_id=%s' % args.family
elif args.test == 'search-family':
url = args.conn + '/search-family?q=%s' % args.query
elif args.test == 'update-family':
payload = {
'category': 'BI',
'situation': 'VIEM',
'nbChild': '3',
'nbTotalChild': '4',
'nbAES': '1',
}
url = args.conn + '/update-family?family_id=%s' % args.family
elif args.test == 'person-catalog':
url = (
args.conn
+ '/get-person-activity-list?family_id=%s&person_id=%s&start_date=2022-09-01&end_date=2023-08-31'
% (
args.family,
args.person,
)
)
elif args.test == 'global-catalog':
url = args.conn + '/read-activity-list'
else:
raise Exception('unknown test')
url += ('&' if '?' in url else '?') + 'apikey=%s' % APIKEY
return url, payload
def check(i, args):
url, payload = get_endpoint(args)
if args.test == 'read-family':
resp = requests.get(url)
elif args.test == 'search-family':
resp = requests.get(url)
elif args.test == 'update-family':
resp = requests.post(url, json=payload)
elif args.test == 'person-catalog':
resp = requests.get(url)
elif args.test == 'global-catalog':
resp = requests.get(url)
else:
raise Exception('unknown test')
resp.raise_for_status()
res = resp.json()
if res['err']:
raise Exception('API error: %s' % res['err_desc'])
return res
if __name__ == '__main__':
parser = argparse.ArgumentParser()
parser.add_argument('--verbose', '-v', type=int, default=2, help='display errors')
parser.add_argument('--conn', '-e', default=CONN, help='url of local instance of maelis connector')
parser.add_argument('--number', '-n', type=int, default=300, help='number of requests')
parser.add_argument('--concurrency', '-c', type=int, default=100, help='number of parallel processes')
parser.add_argument('--test', '-t', default='read-family', help='WS to test')
parser.add_argument('family', help=FAMILY_ID, nargs='?', default=FAMILY_ID)
parser.add_argument(
'query', help='Recherche en texte intégral (plus ou moins)', nargs='?', default='SIMP'
)
parser.add_argument('--person', '-P', default=PERSON_ID, help='person id')
args = parser.parse_args()
done = 0
count = args.number
concurrency = args.concurrency
errors = 0
error_types = set()
durations = []
barrier = threading.Barrier(concurrency + 1)
done_lock = threading.Lock()
def f(i):
global done, durations, errors
barrier.wait()
while done < count:
with done_lock:
if done >= count:
break
current_done = done
done += 1
try:
start = time.time()
check(i, args)
duration = time.time() - start
durations.append(duration)
except Exception as e:
error_types.add(repr(e))
errors += 1
done_value = Value('i', 0, lock=True)
result_queue = Queue(count)
def target(result_queue, done_value):
while done_value.value < count:
with done_value.get_lock():
if done_value.value >= count:
break
done_value.value += 1
try:
start = time.time()
check(i, args)
duration = time.time() - start
result_queue.put((True, duration))
except Exception as e:
result_queue.put((False, repr(e)))
begin = time.time()
processes = []
for i in range(concurrency):
processes.append(Process(target=target, args=(result_queue, done_value)))
processes[-1].start()
while done < count:
ok, value = result_queue.get()
done += 1
print('Done %05d' % done, end='\r')
if ok:
durations.append(value)
else:
errors += 1
error_types.add(value)
print('Done %05d' % done, end='\r')
print()
for process in processes:
process.join()
print('Number of requests', count)
print('Concurrency', concurrency)
print('Errors', errors, 'on', count, 'types: ', list(error_types))
print('RPS', float(count - errors) / (time.time() - begin))
print('Min', min(durations))
print('Max', max(durations))
print('Average', statistics.fmean(durations))
print('Quantiles', statistics.quantiles(durations, n=10))

View File

@ -0,0 +1,28 @@
#!/usr/bin/python3
import argparse
import utils
def check(args):
utils.configure_logging(args.verbose)
client = utils.get_client(args.env, 'Family')
result = client.service.readFamilyListFromFullName(
fullname=args.query,
)
if args.verbose > 1:
print(result)
return result
if __name__ == '__main__':
parser = argparse.ArgumentParser()
parser.add_argument('--verbose', '-v', type=int, default=2, help='display errors')
parser.add_argument('--env', '-e', default='integ', help='dev, test, integ, prod')
parser.add_argument(
'query', help='Full-text search (more or less)', nargs='?', default='TEST_'
)
args = parser.parse_args()
check(args)

View File

@ -0,0 +1,198 @@
#!/usr/bin/python3
import argparse
import copy
import functools
import random
import statistics
import threading
import time
from multiprocessing import Lock, Pool, Process, Queue
from multiprocessing.sharedctypes import Value
import utils
FAMILY_ID = '322423' # NICO TEST / UDAVE INTEG
PERSON_ID = '176658' # INTEG
duis = [str(i) for i in range(330120, 330151)]
client = None
# utils.configure_logging(0)
_client = None
def get_client_label(args):
if args.test in ['read-family', 'search-family', 'update-family']:
return 'Family'
elif args.test in ['person-catalog', 'global-catalog']:
return 'Activity'
else:
raise Exception('unknown test')
def check(client, i, args):
client = client or utils.get_client(args.env, get_client_label(args))
if args.test == 'read-family':
result = client.service.readFamily(dossierNumber=args.family)
elif args.test == 'search-family':
result = client.service.readFamilyListFromFullName(fullname=args.query)
elif args.test == 'update-family':
result = client.service.updateFamily(
dossierNumber=args.family,
category='BI',
situation='VIEM',
nbChild='1',
nbTotalChild='2',
nbAES='3',
)
elif args.test == 'person-catalog':
result = client.service.getPersonCatalogueActivity(
getPersonCatalogueActivityRequestBean={
'numDossier': args.family,
'numPerson': args.person,
'yearSchool': '2022',
'dateStartActivity': '2022-09-01',
'dateEndActivity': '2023-08-31',
}
)
elif args.test == 'global-catalog':
result = client.service.readActivityList(
schoolyear='1970',
dateStartCalend='2022-01-01',
dateEndCalend='2024-12-31',
)
else:
raise Exception('unknown test')
return result
if __name__ == '__main__':
parser = argparse.ArgumentParser()
parser.add_argument('--verbose', '-v', type=int, default=2, help='display errors')
parser.add_argument('--env', '-e', default='integ', help='dev, test, integ, prod')
parser.add_argument('--reuse', '-r', default=False, help='reuse zeep client', action='store_true')
parser.add_argument('--number', '-n', type=int, default=300, help='number of requests')
parser.add_argument('--concurrency', '-c', type=int, default=100, help='number of parallel processes')
parser.add_argument('--test', '-t', default='read-family', help='WS to test')
parser.add_argument('family', help=FAMILY_ID, nargs='?', default=FAMILY_ID)
parser.add_argument(
'query', help='Full-text search (more or less)', nargs='?', default='SIMP'
)
parser.add_argument('--person', '-P', default=PERSON_ID, help='person id')
args = parser.parse_args()
if args.reuse:
_client = utils.get_client(args.env, get_client_label(args))
done = 0
count = args.number
concurrency = args.concurrency
errors = 0
error_types = set()
durations = []
barrier = threading.Barrier(concurrency + 1)
done_lock = threading.Lock()
def f(i):
global done, durations, errors
__client = None
if args.reuse:
__client = _client or utils.get_client(args.env, get_client_label(args))
barrier.wait()
while done < count:
with done_lock:
if done >= count:
break
current_done = done
done += 1
try:
start = time.time()
check(__client, i, args)
duration = time.time() - start
durations.append(duration)
except Exception as e:
error_types.add(repr(e))
errors += 1
if 1:
done_value = Value('i', 0, lock=True)
result_queue = Queue(count)
def target(result_queue, done_value):
_client = None
if args.reuse:
_client = utils.get_client(args.env, get_client_label(args))
while done_value.value < count:
with done_value.get_lock():
if done_value.value >= count:
break
done_value.value += 1
try:
start = time.time()
check(_client or utils.get_client(args.env, get_client_label(args)), i, args)
duration = time.time() - start
result_queue.put((True, duration))
except Exception as e:
result_queue.put((False, repr(e)))
begin = time.time()
processes = []
for i in range(concurrency):
processes.append(Process(target=target, args=(result_queue, done_value)))
processes[-1].start()
while done < count:
ok, value = result_queue.get()
done += 1
print('Done %05d' % done, end='\r')
if ok:
durations.append(value)
else:
errors += 1
error_types.add(value)
print('Done %05d' % done, end='\r')
print()
for process in processes:
process.join()
else:
# Redo the script with multiprocessing rather than threading
# to stay closer to the uwsgi processes
# (creating zeep.Client instances is costly because only one thread can do it at a time)
print('obsolete')
threads = [threading.Thread(target=f, args=(i,)) for i in range(concurrency)]
for thread in threads:
thread.start()
barrier.wait()
begin = time.time()
while done != count:
print('Done %05d' % done, end='\r')
time.sleep(0.5)
print('Done %05d' % done, end='\r')
for thread in threads:
thread.join()
print()
print('Number of requests', count)
print('Concurrency', concurrency)
print('Errors', errors, 'on', count, 'types: ', list(error_types))
print('RPS', float(count - errors) / (time.time() - begin))
print('Min', min(durations))
print('Max', max(durations))
print('Average', statistics.fmean(durations))
print('Quantiles', statistics.quantiles(durations, n=10))

View File

@ -0,0 +1,33 @@
#!/usr/bin/python3
import argparse
import utils
FAMILY_ID = '322423' # NICO
def check(args):
utils.configure_logging(args.verbose)
client = utils.get_client(args.env, 'Family')
result = client.service.updateFamily(
dossierNumber=args.family,
category='BI',
situation='VIEM',
nbChild='1',
nbTotalChild='2',
nbAES='3',
)
if args.verbose > 1:
print(result)
return result
if __name__ == '__main__':
parser = argparse.ArgumentParser()
parser.add_argument('--verbose', '-v', type=int, default=2, help='display errors')
parser.add_argument('--env', '-e', default='integ', help='dev, test, integ, prod')
parser.add_argument('family', help=FAMILY_ID, nargs='?', default=FAMILY_ID)
args = parser.parse_args()
check(args)

View File

@ -8,6 +8,7 @@ import requests
import zeep
from django.core.serializers.json import DjangoJSONEncoder
from lxml import etree
from zeep.cache import InMemoryCache
from zeep.transports import Transport
from zeep.wsse.username import UsernameToken
@ -105,6 +106,9 @@ def get_wsdl_url(env, service):
return config['url'] + 'services/' + service.title() + 'Service?wsdl'
zeep_cache = InMemoryCache()
def get_client(env, service):
config = load_config(env)
settings = zeep.Settings(strict=False, xsd_ignore_sequence_order=True)
@ -112,9 +116,8 @@ def get_client(env, service):
session = requests.Session()
session.verify = config['session_verify']
wsdl_url = get_wsdl_url(env, service)
print(wsdl_url)
transport = Transport(session=session)
transport = Transport(session=session, cache=zeep_cache)
client = zeep.Client(wsdl_url, transport=transport, wsse=wsse, settings=settings)
return client
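Side note on the cache line above: zeep only needs to download the WSDL/XSD documents once per process when its Transport is given a shared cache. A minimal sketch of the same idea outside this repository (make_client and the module layout are hypothetical; InMemoryCache, Transport and Client are the actual zeep APIs used in the diff):

import requests
import zeep
from zeep.cache import InMemoryCache
from zeep.transports import Transport

SHARED_CACHE = InMemoryCache()  # module-level, shared by every client built below

def make_client(wsdl_url):
    # the first call downloads and caches the WSDL; later calls reuse it
    transport = Transport(session=requests.Session(), cache=SHARED_CACHE)
    return zeep.Client(wsdl_url, transport=transport)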

View File

@ -19,7 +19,7 @@
</xs:sequence>
</xs:complexType>
</xs:schema>
<xs:schema xmlns:xs="http://www.w3.org/2001/XMLSchema" xmlns:tns="bean.persistence.school.ws.maelis.sigec.com" xmlns:ns1="activity.ws.maelis.sigec.com" targetNamespace="bean.persistence.school.ws.maelis.sigec.com" version="1.0">
<xs:schema xmlns:xs="http://www.w3.org/2001/XMLSchema" xmlns:tns="bean.persistence.school.ws.maelis.sigec.com" xmlns:ns1="activity.ws.maelis.sigec.com" attributeFormDefault="unqualified" elementFormDefault="unqualified" targetNamespace="bean.persistence.school.ws.maelis.sigec.com">
<xs:import namespace="activity.ws.maelis.sigec.com"/>
<xs:complexType name="weeklyCalendarActivityBean">
<xs:sequence>
@ -68,7 +68,7 @@
</xs:sequence>
</xs:complexType>
</xs:schema>
<xs:schema xmlns:xs="http://www.w3.org/2001/XMLSchema" xmlns:tns="bean.persistence.invoice.ws.maelis.sigec.com" xmlns:ns1="bean.persistence.ws.maelis.sigec.com" targetNamespace="bean.persistence.invoice.ws.maelis.sigec.com" version="1.0">
<xs:schema xmlns:xs="http://www.w3.org/2001/XMLSchema" xmlns:tns="bean.persistence.invoice.ws.maelis.sigec.com" xmlns:ns1="bean.persistence.ws.maelis.sigec.com" attributeFormDefault="unqualified" elementFormDefault="unqualified" targetNamespace="bean.persistence.invoice.ws.maelis.sigec.com">
<xs:import namespace="bean.persistence.ws.maelis.sigec.com"/>
<xs:complexType name="invoiceBean">
<xs:sequence>
@ -117,7 +117,7 @@
</xs:sequence>
</xs:complexType>
</xs:schema>
<xs:schema xmlns:xs="http://www.w3.org/2001/XMLSchema" xmlns:ns1="activity.ws.maelis.sigec.com" targetNamespace="bean.persistence.ape.ws.maelis.sigec.com" version="1.0">
<xs:schema xmlns:xs="http://www.w3.org/2001/XMLSchema" xmlns:ns1="activity.ws.maelis.sigec.com" xmlns="bean.persistence.ape.ws.maelis.sigec.com" attributeFormDefault="unqualified" elementFormDefault="unqualified" targetNamespace="bean.persistence.ape.ws.maelis.sigec.com">
<xs:import namespace="activity.ws.maelis.sigec.com"/>
<xs:complexType name="indicatorValueAddUpdBean">
<xs:complexContent>
@ -132,7 +132,7 @@
</xs:complexContent>
</xs:complexType>
</xs:schema>
<xs:schema xmlns:xs="http://www.w3.org/2001/XMLSchema" xmlns:tns="bean.persistence.activity.ws.maelis.sigec.com" xmlns:ns5="bean.persistence.ws.maelis.sigec.com" xmlns:ns4="bean.persistence.invoice.ws.maelis.sigec.com" xmlns:ns3="bean.persistence.school.ws.maelis.sigec.com" xmlns:ns2="activity.ws.maelis.sigec.com" xmlns:ns1="bean.persistence.ape.ws.maelis.sigec.com" targetNamespace="bean.persistence.activity.ws.maelis.sigec.com" version="1.0">
<xs:schema xmlns:xs="http://www.w3.org/2001/XMLSchema" xmlns:tns="bean.persistence.activity.ws.maelis.sigec.com" xmlns:ns5="bean.persistence.ws.maelis.sigec.com" xmlns:ns4="bean.persistence.invoice.ws.maelis.sigec.com" xmlns:ns3="bean.persistence.school.ws.maelis.sigec.com" xmlns:ns2="activity.ws.maelis.sigec.com" xmlns:ns1="bean.persistence.ape.ws.maelis.sigec.com" attributeFormDefault="unqualified" elementFormDefault="unqualified" targetNamespace="bean.persistence.activity.ws.maelis.sigec.com">
<xs:import namespace="bean.persistence.ape.ws.maelis.sigec.com"/>
<xs:import namespace="activity.ws.maelis.sigec.com"/>
<xs:import namespace="bean.persistence.school.ws.maelis.sigec.com"/>
@ -1169,10 +1169,10 @@
</xs:extension>
</xs:complexContent>
</xs:complexType>
<xs:complexType name="activityUnitPlace2KernelBean">
<xs:complexType name="codeLabelKernelBean">
<xs:sequence>
<xs:element minOccurs="0" name="activityKernel" type="xs:anyType"/>
<xs:element maxOccurs="unbounded" minOccurs="0" name="unitKernelList" nillable="true" type="xs:anyType"/>
<xs:element minOccurs="0" name="code" type="xs:string"/>
<xs:element minOccurs="0" name="label" type="xs:string"/>
</xs:sequence>
</xs:complexType>
<xs:complexType name="consoTarifKernelBean">
@ -1182,10 +1182,10 @@
<xs:element minOccurs="0" name="tarifKernelBean" type="tns:codeLabelKernelBean"/>
</xs:sequence>
</xs:complexType>
<xs:complexType name="codeLabelKernelBean">
<xs:complexType name="activityUnitPlace2KernelBean">
<xs:sequence>
<xs:element minOccurs="0" name="code" type="xs:string"/>
<xs:element minOccurs="0" name="label" type="xs:string"/>
<xs:element minOccurs="0" name="activityKernel" type="xs:anyType"/>
<xs:element maxOccurs="unbounded" minOccurs="0" name="unitKernelList" nillable="true" type="xs:anyType"/>
</xs:sequence>
</xs:complexType>
<xs:complexType name="activityPeriodCapacityKernelBean">

View File

@ -9,7 +9,7 @@
</xs:sequence>
</xs:complexType>
</xs:schema>
<xs:schema xmlns:xs="http://www.w3.org/2001/XMLSchema" xmlns:tns="bean.persistence.ape.ws.maelis.sigec.com" xmlns:ns2="bean.persistence.ws.maelis.sigec.com" xmlns:ns1="ape.ws.maelis.sigec.com" targetNamespace="bean.persistence.ape.ws.maelis.sigec.com" version="1.0">
<xs:schema xmlns:xs="http://www.w3.org/2001/XMLSchema" xmlns:tns="bean.persistence.ape.ws.maelis.sigec.com" xmlns:ns2="bean.persistence.ws.maelis.sigec.com" xmlns:ns1="ape.ws.maelis.sigec.com" attributeFormDefault="unqualified" elementFormDefault="unqualified" targetNamespace="bean.persistence.ape.ws.maelis.sigec.com">
<xs:import namespace="ape.ws.maelis.sigec.com"/>
<xs:import namespace="bean.persistence.ws.maelis.sigec.com"/>
<xs:complexType name="addApeBookRequestBean">
@ -158,6 +158,7 @@
<xs:sequence>
<xs:element minOccurs="0" name="activityType" type="tns:activityTypeBean"/>
<xs:element minOccurs="0" name="idActivity" type="xs:string"/>
<xs:element minOccurs="0" name="idService" type="xs:string"/>
<xs:element minOccurs="0" name="libelle" type="xs:string"/>
<xs:element minOccurs="0" name="libelle2" type="xs:string"/>
<xs:element minOccurs="0" name="manager1" type="tns:managerBean"/>

View File

@ -820,13 +820,6 @@
<xs:element maxOccurs="unbounded" minOccurs="0" name="streetList" type="ns1:StreetBean"/>
</xs:sequence>
</xs:complexType>
<xs:simpleType name="indicatorTypeDescEnum">
<xs:restriction base="xs:string">
<xs:enumeration value="NONE"/>
<xs:enumeration value="NOTE"/>
<xs:enumeration value="CHOICE"/>
</xs:restriction>
</xs:simpleType>
<xs:simpleType name="blocNoteTypeEnum">
<xs:restriction base="xs:string">
<xs:enumeration value="A"/>
@ -836,6 +829,22 @@
<xs:enumeration value="ID"/>
</xs:restriction>
</xs:simpleType>
<xs:simpleType name="indicatorTypeDescEnum">
<xs:restriction base="xs:string">
<xs:enumeration value="NONE"/>
<xs:enumeration value="NOTE"/>
<xs:enumeration value="CHOICE"/>
</xs:restriction>
</xs:simpleType>
<xs:simpleType name="subscribeAction">
<xs:restriction base="xs:string">
<xs:enumeration value="ADD_SUBSCRIBE"/>
<xs:enumeration value="DELETE_SUBSCRIBE"/>
<xs:enumeration value="UPDATE_SUBSCRIBE_DOSS"/>
<xs:enumeration value="ADD_DEROG"/>
<xs:enumeration value="COMPLETE_DEROG_DOSS"/>
</xs:restriction>
</xs:simpleType>
<xs:simpleType name="levelCode">
<xs:restriction base="xs:string">
<xs:enumeration value="SCHOOL"/>
@ -853,15 +862,6 @@
<xs:enumeration value="PERSON"/>
</xs:restriction>
</xs:simpleType>
<xs:simpleType name="subscribeAction">
<xs:restriction base="xs:string">
<xs:enumeration value="ADD_SUBSCRIBE"/>
<xs:enumeration value="DELETE_SUBSCRIBE"/>
<xs:enumeration value="UPDATE_SUBSCRIBE_DOSS"/>
<xs:enumeration value="ADD_DEROG"/>
<xs:enumeration value="COMPLETE_DEROG_DOSS"/>
</xs:restriction>
</xs:simpleType>
<xs:element name="MaelisFamilyException" type="tns:MaelisFamilyException"/>
<xs:complexType name="MaelisFamilyException">
<xs:sequence>
@ -870,7 +870,7 @@
</xs:sequence>
</xs:complexType>
</xs:schema>
<xs:schema xmlns:xs="http://www.w3.org/2001/XMLSchema" xmlns:ns1="family.ws.maelis.sigec.com" targetNamespace="bean.persistence.ws.maelis.sigec.com" version="1.0">
<xs:schema xmlns:xs="http://www.w3.org/2001/XMLSchema" xmlns:ns1="family.ws.maelis.sigec.com" xmlns="bean.persistence.ws.maelis.sigec.com" attributeFormDefault="unqualified" elementFormDefault="unqualified" targetNamespace="bean.persistence.ws.maelis.sigec.com">
<xs:import namespace="family.ws.maelis.sigec.com"/>
<xs:complexType name="itemBean">
<xs:sequence>
@ -908,7 +908,7 @@
</xs:complexContent>
</xs:complexType>
</xs:schema>
<xs:schema xmlns:xs="http://www.w3.org/2001/XMLSchema" xmlns:ns1="bean.persistence.family.ws.maelis.sigec.com" targetNamespace="bean.persistence.site.ws.maelis.sigec.com" version="1.0">
<xs:schema xmlns:xs="http://www.w3.org/2001/XMLSchema" xmlns:ns1="bean.persistence.family.ws.maelis.sigec.com" xmlns="bean.persistence.site.ws.maelis.sigec.com" attributeFormDefault="unqualified" elementFormDefault="unqualified" targetNamespace="bean.persistence.site.ws.maelis.sigec.com">
<xs:import namespace="bean.persistence.family.ws.maelis.sigec.com"/>
<xs:complexType name="subscribeSchoolYearBean">
<xs:sequence>
@ -916,7 +916,7 @@
</xs:sequence>
</xs:complexType>
</xs:schema>
<xs:schema xmlns:xs="http://www.w3.org/2001/XMLSchema" xmlns:tns="bean.persistence.family.ws.maelis.sigec.com" xmlns:ns3="bean.persistence.site.ws.maelis.sigec.com" xmlns:ns2="bean.persistence.ws.maelis.sigec.com" xmlns:ns1="family.ws.maelis.sigec.com" targetNamespace="bean.persistence.family.ws.maelis.sigec.com" version="1.0">
<xs:schema xmlns:xs="http://www.w3.org/2001/XMLSchema" xmlns:tns="bean.persistence.family.ws.maelis.sigec.com" xmlns:ns3="bean.persistence.site.ws.maelis.sigec.com" xmlns:ns2="bean.persistence.ws.maelis.sigec.com" xmlns:ns1="family.ws.maelis.sigec.com" attributeFormDefault="unqualified" elementFormDefault="unqualified" targetNamespace="bean.persistence.family.ws.maelis.sigec.com">
<xs:import namespace="family.ws.maelis.sigec.com"/>
<xs:import namespace="bean.persistence.ws.maelis.sigec.com"/>
<xs:import namespace="bean.persistence.site.ws.maelis.sigec.com"/>

View File

@ -152,7 +152,7 @@
</xs:sequence>
</xs:complexType>
</xs:schema>
<xs:schema xmlns:xs="http://www.w3.org/2001/XMLSchema" xmlns:tns="bean.persistence.invoice.ws.maelis.sigec.com" xmlns:ns2="bean.persistence.ws.maelis.sigec.com" xmlns:ns1="ws.maelis.sigec.com" targetNamespace="bean.persistence.invoice.ws.maelis.sigec.com" version="1.0">
<xs:schema xmlns:xs="http://www.w3.org/2001/XMLSchema" xmlns:tns="bean.persistence.invoice.ws.maelis.sigec.com" xmlns:ns2="bean.persistence.ws.maelis.sigec.com" xmlns:ns1="ws.maelis.sigec.com" attributeFormDefault="unqualified" elementFormDefault="unqualified" targetNamespace="bean.persistence.invoice.ws.maelis.sigec.com">
<xs:import namespace="ws.maelis.sigec.com"/>
<xs:import namespace="bean.persistence.ws.maelis.sigec.com"/>
<xs:complexType name="personBankBean">

View File

@ -97,7 +97,7 @@
</xs:sequence>
</xs:complexType>
</xs:schema>
<xs:schema xmlns:xs="http://www.w3.org/2001/XMLSchema" xmlns:tns="bean.persistence.site.ws.maelis.sigec.com" xmlns:ns1="bean.persistence.ws.maelis.sigec.com" targetNamespace="bean.persistence.site.ws.maelis.sigec.com" version="1.0">
<xs:schema xmlns:xs="http://www.w3.org/2001/XMLSchema" xmlns:tns="bean.persistence.site.ws.maelis.sigec.com" xmlns:ns1="bean.persistence.ws.maelis.sigec.com" attributeFormDefault="unqualified" elementFormDefault="unqualified" targetNamespace="bean.persistence.site.ws.maelis.sigec.com">
<xs:import namespace="bean.persistence.ws.maelis.sigec.com"/>
<xs:complexType name="readSchoolForAdressAndLevelRequestBean">
<xs:sequence>

View File

@ -17,6 +17,8 @@ from math import inf
from dateutil.relativedelta import relativedelta
from passerelle.utils.jsonresponse import APIError
json_date_format = '%Y-%m-%d'
@ -47,3 +49,12 @@ def get_public_criterias(today, start_dob, end_dob):
data.append((str(i), publics_txt[i]))
break
return data
def strtobool(val):
val = val.lower()
if val in ('y', 'yes', 't', 'true', 'on', '1'):
return True
elif val in ('n', 'no', 'f', 'false', 'off', '0'):
return False
raise APIError('invalid truth value %r' % val)
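For illustration, a hedged sketch of how an endpoint could consume this helper (read_flag and its parameter are hypothetical, not taken from the diff); the point is that bad input surfaces as an APIError rather than a ValueError:

def read_flag(strict='false'):
    # 'y'/'yes'/'t'/'true'/'on'/'1' -> True, 'n'/'no'/'f'/'false'/'off'/'0' -> False
    strict = strtobool(strict)
    return {'err': 0, 'strict': strict}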

View File

@ -177,6 +177,12 @@ class ToulouseSmartResource(BaseResource, HTTPResource):
except (KeyError, TypeError):
block = {}
data = {}
if not isinstance(block, dict):
raise APIError(
"cannot retrieve '%s' block content from post data: got a %s where a dict was expected"
% (wcs_block_varname, type(block)),
http_status=400,
)
cast = {'string': str, 'int': int, 'boolean': bool, 'item': str}
for prop in intervention_type.get('properties') or []:
varname = slugify(prop['name']).replace('-', '_')

View File

@ -7,8 +7,8 @@ msgid ""
msgstr ""
"Project-Id-Version: Passerelle 0\n"
"Report-Msgid-Bugs-To: \n"
"POT-Creation-Date: 2023-09-19 18:00+0200\n"
"PO-Revision-Date: 2023-07-06 18:06+0200\n"
"POT-Creation-Date: 2023-11-22 21:24+0100\n"
"PO-Revision-Date: 2023-11-22 21:24+0100\n"
"Last-Translator: Frederic Peters <fpeters@entrouvert.com>\n"
"Language: fr\n"
"MIME-Version: 1.0\n"
@ -101,15 +101,16 @@ msgstr "Jeton d'accès aux API (API token)"
#: apps/atos_genesys/models.py apps/bbb/models.py apps/bdp/models.py
#: apps/clicrdv/models.py apps/esabora/models.py apps/esirius/models.py
#: apps/esup_signature/models.py apps/family/models.py apps/gdc/models.py
#: apps/gesbac/models.py apps/litteralis/models.py apps/okina/models.py
#: apps/signal_arretes/models.py apps/sne/models.py apps/soap/models.py
#: apps/solis/models.py apps/vivaticket/models.py
#: apps/gesbac/models.py apps/litteralis/models.py apps/matrix42/models.py
#: apps/okina/models.py apps/signal_arretes/models.py apps/sne/models.py
#: apps/soap/models.py apps/solis/models.py apps/vivaticket/models.py
#: contrib/caluire_axel/models.py contrib/dpark/models.py
#: contrib/esirius_swi/models.py contrib/fake_family/models.py
#: contrib/gdema/models.py contrib/greco/models.py
#: contrib/grenoble_gru/models.py contrib/isere_ens/models.py
#: contrib/iws/models.py contrib/lille_kimoce/models.py
#: contrib/mdph13/models.py contrib/planitech/models.py contrib/rsa13/models.py
#: contrib/isere_esrh/models.py contrib/iws/models.py
#: contrib/lille_kimoce/models.py contrib/mdph13/models.py
#: contrib/planitech/models.py contrib/rsa13/models.py
#: contrib/sigerly/models.py contrib/solis_afi_mss/models.py
#: contrib/solis_apa/models.py contrib/teamnet_axel/models.py
#: contrib/toulouse_axel/models.py contrib/toulouse_foederis/models.py
@ -165,7 +166,7 @@ msgstr "Nom de l'action"
#: apps/airquality/models.py apps/cartads_cs/models.py apps/cryptor/models.py
#: apps/holidays/models.py apps/ldap/models.py apps/pdf/models.py
#: apps/proxy/models.py contrib/strasbourg_eu/models.py
#: apps/proxy/models.py apps/qrcode/models.py contrib/strasbourg_eu/models.py
#: templates/passerelle/manage/service_view.html
msgid "Misc"
msgstr "Divers"
@ -441,7 +442,7 @@ msgid "Accessible scopes list"
msgstr "Liste des domaines (« scopes ») accessibles"
#: apps/arcgis/models.py apps/arpege_ecp/models.py apps/astech/models.py
#: apps/atos_genesys/models.py apps/filr_rest/models.py
#: apps/atos_genesys/models.py apps/filr_rest/models.py apps/matrix42/models.py
#: contrib/isere_ens/models.py contrib/mdph13/models.py contrib/rsa13/models.py
#: contrib/sigerly/models.py contrib/toulouse_foederis/models.py
#: contrib/toulouse_smart/models.py
@ -658,6 +659,11 @@ msgstr "Demande"
msgid "AS-TECH"
msgstr "AS-TECH"
#: apps/astech/models.py
#, python-format
msgid "Value of %s exceeds authorized length (%s)"
msgstr "La valeur de %s dépasse la longueur autorisée (%s)"
#: apps/astech/models.py
msgid "See all possible connections codes (see configuration)"
msgstr "Voir tous les codes de connexion possibles (pour la configuration)"
@ -744,6 +750,39 @@ msgstr "Identifiant de la demande"
msgid "List all demand possible positions"
msgstr "Lister toutes les positions possibles pour une demande"
#: apps/astech/models.py
msgid "List available views"
msgstr "Lister les vues disponibles"
#: apps/astech/models.py
#: contrib/solis_apa/templates/passerelle/contrib/solis_apa/detail.html
msgid "Referential"
msgstr "Référentiels"
#: apps/astech/models.py
msgid "Get view columns"
msgstr "Obtenir les colonnes dune vue"
#: apps/astech/models.py
msgid "View code"
msgstr "Code de la vue"
#: apps/astech/models.py
msgid "Get view data"
msgstr "Obtenir les données dune vue"
#: apps/astech/models.py
msgid "Name of column contaning the id"
msgstr "Nom de la colonne contenant lidentifiant"
#: apps/astech/models.py
msgid "Name of column contaning the label"
msgstr "Nom de la colonne contenant le libellé"
#: apps/astech/models.py
msgid "Semicolon separated filter expressions"
msgstr "Expressions de filtre séparées par des point-virgules"
#: apps/astre_rest/models.py apps/astregs/models.py
msgid "Organisme"
msgstr "Organisme"
@ -1003,6 +1042,10 @@ msgstr "Ajouter des pièces jointes à des demandes de travaux"
msgid "Get the status of a works request"
msgstr "Récupérer le statut dune demande de travaux"
#: apps/atal_rest/models.py
msgid "Get the status of a works request intervention"
msgstr "Récupérer le statut davancement des travaux"
#: apps/atos_genesys/models.py
msgid "Code RGP"
msgstr "Code RG"
@ -1439,7 +1482,7 @@ msgstr "Clé secrète"
msgid "User full name"
msgstr "Nom complet de lutilisateur"
#: apps/bbb/models.py contrib/toulouse_foederis/models.py
#: apps/bbb/models.py apps/qrcode/models.py contrib/toulouse_foederis/models.py
msgid "Created"
msgstr "Créé"
@ -1447,7 +1490,7 @@ msgstr "Créé"
msgid "Updated"
msgstr "Mise à jour"
#: apps/bbb/models.py apps/franceconnect_data/models.py
#: apps/bbb/models.py apps/franceconnect_data/models.py apps/qrcode/models.py
msgid "UUID"
msgstr "UUID"
@ -3012,6 +3055,75 @@ msgstr "Récupérer le statut d'une demande"
msgid "Get submission decree"
msgstr "Récupérer lacte dune demande"
#: apps/matrix42/models.py
msgid "Matrix42 Public API"
msgstr "Matrix42 API Publique"
#: apps/matrix42/models.py
msgid "Example: https://xxx.m42cloud.com/m42Services/api/"
msgstr "Exemple : https://xxx.m42cloud.com/m42Services/api/"
#: apps/matrix42/models.py
msgid "Authorization Token"
msgstr "Jeton dauthentification"
#: apps/matrix42/models.py
msgid "Fragment Query"
msgstr "Requête sur fragments"
#: apps/matrix42/models.py
msgid "Fragments"
msgstr "Fragments (fragments)"
#: apps/matrix42/models.py
msgid "Technical name of the Data Definition"
msgstr "Nom technique de la définition de donnée"
#: apps/matrix42/models.py
msgid "Columns in the result set, separated by comma"
msgstr "Colonnes dans le résultat, séparées par des virgules"
#: apps/matrix42/models.py
msgid "Filter: \"WHERE filter\""
msgstr "Filtre : « WHERE filter »"
#: apps/matrix42/models.py
msgid ""
"Django template for text attribute - if none, use DisplayString|DisplayName|"
"Name"
msgstr ""
"Gabarit Django pour lattribut « text ». Si absent, utilise DisplayString, "
"ou DisplayName, ou Name"
#: apps/matrix42/models.py
msgid "Django template for id attribute - if none, use ID"
msgstr "Gabarit Django pour lattribut « id ». Si absent, utilise ID"
#: apps/matrix42/models.py
msgid "Search column: \"WHERE search_column LIKE '%q%' (AND filter)\""
msgstr ""
"Colonne de recherche : « WHERE search_column LIKE '%q%' ( AND filter ) »"
#: apps/matrix42/models.py
msgid "Search text (needs a search_column)"
msgstr "Texte à chercher (nécessite de préciser la colonne avec search_column)"
#: apps/matrix42/models.py
msgid "Get the whole fragment with this ID"
msgstr "Récupérer le fragment complet pour cet ID"
#: apps/matrix42/models.py
msgid "Get an object"
msgstr "Récupérer un objet"
#: apps/matrix42/models.py
msgid "Objects"
msgstr "Objets (objects)"
#: apps/matrix42/models.py
msgid "Create an new object"
msgstr "Créer un objet"
#: apps/mdel/models.py
msgid "SFTP server for outgoing files"
msgstr "Serveur SFTP pour les fichiers sortants"
@ -4038,6 +4150,111 @@ msgstr "Envoyer une requête"
msgid "request will be made on Upstream Service Base URL + path"
msgstr "la requête sera envoyée sur « URL de base système + path »"
#: apps/qrcode/models.py
msgid "Data to encode in the certificate"
msgstr "Donnée à émettre dans le certificat (un dictionnaire)"
#: apps/qrcode/models.py
msgid "Private Key"
msgstr "Clé privée"
#: apps/qrcode/models.py
msgid "QR Code"
msgstr "Code QR"
#: apps/qrcode/models.py
msgid "Create or update a certificate"
msgstr "Créer ou mettre à jour un certificat"
#: apps/qrcode/models.py
msgid "Certificate identifier"
msgstr "Identifiant du certificat"
#: apps/qrcode/models.py
msgid "Retrieve an existing certificate"
msgstr "Récupérer un certificat existant"
#: apps/qrcode/models.py
msgid "Get QR Code"
msgstr "Obtenir limage du code QR (format PNG)"
#: apps/qrcode/models.py
msgid "Create or update a qrcode reader"
msgstr "Créer ou mettre à jour un accès au lecteur de code QR"
#: apps/qrcode/models.py
msgid "QRCode reader identifier"
msgstr "Identifiant de laccès au lecteur de code QR"
#: apps/qrcode/models.py
msgid "Get informations about a QRCode reader"
msgstr "Obtenir les informations sur laccès au lecteur de code QR"
#: apps/qrcode/models.py
msgid "Open a QRCode reader page."
msgstr "Obtenir l'URL du lecteur de code QR"
#: apps/qrcode/models.py
msgid "Last modification"
msgstr "Dernière modification"
#: apps/qrcode/models.py
msgid "Validity Start Date"
msgstr "Date de début de validité"
#: apps/qrcode/models.py
msgid "Validity End Date"
msgstr "Date de fin de validité"
#: apps/qrcode/templates/qrcode/qrcode-reader.html
msgid "Close"
msgstr "Fermer"
#: apps/qrcode/templates/qrcode/qrcode-reader.html
msgid "QR code Expired"
msgstr "Le certificat est expiré."
#: apps/qrcode/templates/qrcode/qrcode-reader.html sms/forms.py
msgid "From"
msgstr "De"
#: apps/qrcode/templates/qrcode/qrcode-reader.html
msgid "This QR code isn't supported by this application."
msgstr "Ce code QR n'est pas supporté par cette application."
#: apps/qrcode/templates/qrcode/qrcode-reader.html
msgid "Signature verification failed."
msgstr "Échec de la vérification de la signature du certificat."
#: apps/qrcode/templates/qrcode/qrcode-reader.html
msgid ""
"QR code reader isn\\'t supported on your platform. Please update your "
"browser."
msgstr ""
"Le lecteur de code QR n'est pas supporté sur votre plateforme. Essayez de "
"mettre à jour votre navigateur ou d'en changer."
#: apps/qrcode/templates/qrcode/qrcode-reader.html
msgid "QR code not yet valid"
msgstr "Le certificat n'est pas encore valide."
#: apps/qrcode/templates/qrcode/qrcode-reader.html sms/forms.py
msgid "To"
msgstr "À"
#: apps/qrcode/templates/qrcode/qrcode-reader.html
msgid "Valid QR code"
msgstr "Code QR et certificat valide"
#: apps/qrcode/templates/qrcode/qrcode-reader.html
msgid "Reader isn't usable yet."
msgstr ""
"Vous n'êtes pas encore autorisé à lire les codes QR, veuillez attendre."
#: apps/qrcode/templates/qrcode/qrcode-reader.html
msgid "Reader has expired."
msgstr "La période d'autorisation de votre lecteur de code QR est terminée."
#: apps/sector/models.py
msgid "all"
msgstr "tout"
@ -4177,7 +4394,6 @@ msgid "Minimal house number may not be lesser than maximal house number."
msgstr "Le numéro minimal ne peut pas être inférieur au numéro maximal."
#: apps/sendethic/models.py
#| msgid "Account"
msgid "Account ID"
msgstr "ID du compte"
@ -4278,6 +4494,7 @@ msgid "Comment"
msgstr "Commentaire :"
#: apps/signal_arretes/models.py contrib/esirius_swi/models.py
#: contrib/isere_esrh/models.py
msgid "Base API URL"
msgstr "URL de base"
@ -5563,6 +5780,42 @@ msgstr "Annuler une réservation"
msgid "External ID"
msgstr "ID externe"
#: contrib/isere_esrh/models.py
msgid "ESRH Isère"
msgstr ""
#: contrib/isere_esrh/models.py
msgid "Get official informations"
msgstr "Récupérer les informations officielles"
#: contrib/isere_esrh/models.py
msgid "Official registration number"
msgstr "Numéro d'enregistrement officiel"
#: contrib/isere_esrh/models.py
msgid "Public authority"
msgstr "Autorité publique"
#: contrib/isere_esrh/models.py
msgid "Get entities"
msgstr "Lister les entités"
#: contrib/isere_esrh/models.py
msgid "Filter entities whose label matches this regex (case insensitive)"
msgstr ""
"Filtres les entités dont le label correspond à cette expression régulière "
"(en ignorant la casse)"
#: contrib/isere_esrh/models.py
msgid "Filter entities whose code matches this regex (case insensitive)"
msgstr ""
"Filtres les entités dont le code correspond à cette expression régulière (en "
"ignorant la casse)"
#: contrib/isere_esrh/models.py
msgid "Get job types"
msgstr "Lister les types d'emploi"
#: contrib/iws/models.py
msgid "URL of SOAP operation endpoint"
msgstr "URL du point daccès des opérations SOAP"
@ -6166,10 +6419,6 @@ msgstr "Solis™ (obsolète)"
msgid "Import Demand"
msgstr "Import dune demande"
#: contrib/solis_apa/templates/passerelle/contrib/solis_apa/detail.html
msgid "Referential"
msgstr "Référentiels"
#: contrib/solis_apa/views.py
msgid "Unknown suivi type"
msgstr "Type de suivi inconnu"
@ -6479,6 +6728,10 @@ msgstr "Nom du fichier"
msgid "MIME type"
msgstr "type MIME du contenu"
#: contrib/toulouse_foederis/models.py
msgid "ID of an element of the data source 'niveau-diplome'."
msgstr "Identifiant dun élément de la source de données « niveau-diplome »."
#: contrib/toulouse_foederis/models.py
msgid "Application Type (External or Internal)."
msgstr "Type de candidature (Externe ou Interne)."
@ -6612,8 +6865,8 @@ msgid "RGPD agreement."
msgstr "Accord RGPD."
#: contrib/toulouse_foederis/models.py
msgid "IDs of elements of the data source 'type-emploi'."
msgstr "Identifiants déléments de la source de données « type-emploi »."
msgid "Wanted job types"
msgstr "Types d'emploi souhaités"
#: contrib/toulouse_foederis/models.py
msgid "IDs of elements of the data source 'domaine-emploi'."
@ -6648,10 +6901,6 @@ msgstr "Nom du diplôme du candidat stagiaire."
msgid "Candidate trainee's diploma speciality."
msgstr "Spécialité du diplôme du candidat stagiaire."
#: contrib/toulouse_foederis/models.py
msgid "ID of an element of the data source 'niveau-diplome'."
msgstr "Identifiant dun élément de la source de données « niveau-diplome »."
#: contrib/toulouse_foederis/models.py
msgid "Candidate trainee's last obtained diploma."
msgstr "Dernier diplôme obtenu par le candidat stagiaire."
@ -6870,14 +7119,6 @@ msgstr "Fichiers export de blocs pour w.c.s."
msgid "All levels"
msgstr "Tous les niveaux"
#: sms/forms.py
msgid "To"
msgstr "À"
#: sms/forms.py
msgid "From"
msgstr "De"
#: sms/forms.py templates/passerelle/includes/resource-logs-table.html
#: templates/passerelle/manage/log.html
msgid "Message"
@ -7005,8 +7246,7 @@ msgstr ""
#: sms/templates/sms/credit_alert_body.html
#: sms/templates/sms/credit_alert_body.txt
msgid "Please add more credit as soon as possible for your account."
msgstr ""
"Merci dajouter du crédit sur votre compte dès que possible."
msgstr "Merci dajouter du crédit sur votre compte dès que possible."
#: sms/templates/sms/credit_alert_body.html
msgid "View connector page"

View File

@ -77,6 +77,7 @@ MIDDLEWARE = (
'django.contrib.auth.middleware.AuthenticationMiddleware',
'django.contrib.messages.middleware.MessageMiddleware',
'django.middleware.clickjacking.XFrameOptionsMiddleware',
'passerelle.utils.defer.run_later_middleware',
)
ROOT_URLCONF = 'passerelle.urls'
@ -162,6 +163,7 @@ INSTALLED_APPS = (
'passerelle.apps.jsondatastore',
'passerelle.apps.ldap',
'passerelle.apps.litteralis',
'passerelle.apps.matrix42',
'passerelle.apps.mdel',
'passerelle.apps.mdel_ddpacs',
'passerelle.apps.mobyt',
@ -187,6 +189,7 @@ INSTALLED_APPS = (
'passerelle.apps.twilio',
'passerelle.apps.vivaticket',
'passerelle.apps.sendethic',
'passerelle.apps.qrcode',
# backoffice templates and static
'gadjo',
)
@ -195,6 +198,7 @@ INSTALLED_APPS = (
PASSERELLE_APP_BDP_ENABLED = False
PASSERELLE_APP_GDC_ENABLED = False
PASSERELLE_APP_STRASBOURG_EU_ENABLED = False
PASSERELLE_APP_TOULOUSE_MAELIS_ENABLED = False
# mark some apps as legacy
PASSERELLE_APP_CLICRDV_LEGACY = True
@ -204,6 +208,9 @@ PASSERELLE_APP_SOLIS_APA_LEGACY = True
PDFTK_PATH = '/usr/bin/pdftk'
PDFTK_TIMEOUT = 20
# passerelle.apps.opengis configuration
OPENGIS_SKIPPED_NAMESPACES = ['http://www.opengis.net/wfs', 'http://www.opengis.net/gml']
# Authentication settings
try:
import mellon
@ -280,6 +287,9 @@ CONNECTORS_SETTINGS = {}
# List of authorized content-types, as regular expressions, for substitutions
REQUESTS_SUBSTITUTIONS_CONTENT_TYPES = [r'text/.*', r'application/(.*\+)?json', r'application/(.*\+)?xml']
# List of hostnames where certificate errors should be ignored
REQUESTS_IGNORE_HTTPS_CERTIFICATE_ERRORS = []
# Passerelle can receive big requests (for example base64 encoded files)
DATA_UPLOAD_MAX_MEMORY_SIZE = 100 * 1024 * 1024
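The new REQUESTS_IGNORE_HTTPS_CERTIFICATE_ERRORS list is meant to stay empty by default and be overridden per deployment; a hypothetical local-settings override could look like:

# local settings override (hostname is hypothetical)
REQUESTS_IGNORE_HTTPS_CERTIFICATE_ERRORS = ['legacy-backend.example.org']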

View File

@ -366,11 +366,10 @@ class Request(RequestSession):
# search in legacy urls
legacy_urls_mapping = getattr(settings, 'LEGACY_URLS_MAPPING', None)
if legacy_urls_mapping:
splitted_url = urllib.parse.urlparse(url)
hostname = splitted_url.netloc
if hostname in legacy_urls_mapping:
url = splitted_url._replace(netloc=legacy_urls_mapping[hostname]).geturl()
splitted_url = urllib.parse.urlparse(url)
hostname = splitted_url.netloc
if legacy_urls_mapping and hostname in legacy_urls_mapping:
url = splitted_url._replace(netloc=legacy_urls_mapping[hostname]).geturl()
if self.resource:
if 'auth' not in kwargs:
@ -394,6 +393,9 @@ class Request(RequestSession):
if proxy:
kwargs['proxies'] = {'http': proxy, 'https': proxy}
if hostname in settings.REQUESTS_IGNORE_HTTPS_CERTIFICATE_ERRORS:
kwargs['verify'] = False
if method == 'GET' and cache_duration:
cache_key = hashlib.md5(force_bytes('%r;%r' % (url, kwargs))).hexdigest()
cache_content = cache.get(cache_key)
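Both branches above are driven by plain Django settings; a hedged example of the shapes they expect (hostnames are hypothetical, setting names come from the code shown):

# LEGACY_URLS_MAPPING rewrites the netloc before the request is sent
LEGACY_URLS_MAPPING = {'old-api.example.org': 'new-api.example.org'}
# hosts listed here get verify=False on outgoing requests
REQUESTS_IGNORE_HTTPS_CERTIFICATE_ERRORS = ['self-signed.example.org']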

96
passerelle/utils/defer.py Normal file
View File

@ -0,0 +1,96 @@
# passerelle - uniform access to multiple data sources and services
# Copyright (C) 2023 Entr'ouvert
#
# This program is free software: you can redistribute it and/or modify it
# under the terms of the GNU Affero General Public License as published
# by the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU Affero General Public License for more details.
#
# You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <http://www.gnu.org/licenses/>.
import contextlib
from contextvars import ContextVar
from django.core.management import BaseCommand
from django.db import connection
# See https://docs.python.org/3/library/contextvars.html
# ContextVar are concurrency-safe variables, they are thread safe (like
# threading.local()) and coroutine (asyncio) safe.
run_later_context: ContextVar[list] = ContextVar('run_later_context')
def is_in_transaction():
return getattr(connection, 'in_atomic_block', False)
@contextlib.contextmanager
def run_later_scope():
try:
run_later_context.get()
except LookupError:
callbacks = []
token = run_later_context.set(callbacks)
try:
yield
finally:
run_later_context.reset(token)
for func, args, kwargs in callbacks:
func(*args, **kwargs)
else:
# nested scopes have no effect, callbacks will always be called by the
# most enclosing scope, i.e. in this case:
# with run_later_scope():
# with run_later_scope():
# run_later(f)
# (1)
# ..other statements..
# (2)
#
# the function will be called at point (2), not (1)
yield
def run_later(func, *args, **kwargs):
try:
callbacks = run_later_context.get()
except LookupError:
# no scope, run immediately
return func(*args, **kwargs)
else:
callbacks.append((func, args, kwargs))
return None
def run_later_if_in_transaction(func, *args, **kwargs):
if is_in_transaction():
return run_later(func, *args, **kwargs)
else:
return func(*args, **kwargs)
class run_later_middleware:
def __init__(self, get_response):
self.get_response = get_response
def __call__(self, request):
with run_later_scope():
return self.get_response(request)
# monkeypatch BaseCommand execute to provide the same service to commands
old_BaseCommand_execute = BaseCommand.execute
def BaseCommand_execute(self, *args, **kwargs):
with run_later_scope():
return old_BaseCommand_execute(self, *args, **kwargs)
BaseCommand.execute = BaseCommand_execute
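A minimal usage sketch of the module above (notify is a hypothetical callback; the import path matches the new file): run_later() executes immediately when no scope is active and defers to the end of the enclosing scope otherwise, which is exactly what run_later_middleware sets up around each request:

from passerelle.utils.defer import run_later, run_later_scope

def notify(job_id):
    print('notify', job_id)  # hypothetical side effect

run_later(notify, 1)          # no scope active: runs immediately

with run_later_scope():       # what the middleware wraps around a request
    run_later(notify, 2)      # queued, not run yet
    print('request handling runs first')
# leaving the scope runs the queued callbacks, so notify(2) prints here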

View File

@ -474,16 +474,18 @@ class GenericEndpointView(GenericConnectorMixin, SingleObjectMixin, View):
connector_name, endpoint_name = kwargs['connector'], kwargs['endpoint']
url = request.get_full_path()
logger_extra = {
'request': request,
'connector': connector_name,
'connector_endpoint': endpoint_name,
'connector_endpoint_url': url,
'publik_caller_url': request.headers.get('Publik-Caller-URL', ''),
}
if request.method.lower() not in self.endpoint.endpoint_info.methods:
logger_extra['connector_endpoint_method'] = self._allowed_methods()
self.connector.logger.warning(
'endpoint %s %s (=> 405)' % (request.method, url),
extra={
'request': request,
'connector': connector_name,
'connector_endpoint': endpoint_name,
'connector_endpoint_method': self._allowed_methods(),
'connector_endpoint_url': url,
},
extra=logger_extra,
)
return self.http_method_not_allowed(request, *args, **kwargs)
@ -496,16 +498,15 @@ class GenericEndpointView(GenericConnectorMixin, SingleObjectMixin, View):
except UnicodeDecodeError:
payload = '<BINARY PAYLOAD>'
logger_extra.update(
{
'connector_endpoint_method': request.method,
'connector_payload': payload,
}
)
self.connector.logger.info(
'endpoint %s %s (%r) ' % (request.method, url, payload),
extra={
'request': request,
'connector': connector_name,
'connector_endpoint': endpoint_name,
'connector_endpoint_method': request.method,
'connector_endpoint_url': url,
'connector_payload': payload,
},
extra=logger_extra,
)
if not self.check_perms(request):

6
setup-vitest.sh Normal file
View File

@ -0,0 +1,6 @@
#!/bin/bash
pip install "$@"
nodeenv --prebuilt --python-virtualenv
source $VIRTUAL_ENV/bin/activate # source again to activate npm from env
npm install -g vitest happy-dom

View File

@ -46,7 +46,7 @@ def get_version():
real_number, commit_count, commit_hash = result.split('-', 2)
version = '%s.post%s+%s' % (real_number, commit_count, commit_hash)
else:
version = result
version = result.replace('.dirty', '+dirty')
return version
else:
return '0.0.post%s' % len(subprocess.check_output(['git', 'rev-list', 'HEAD']).splitlines())
@ -175,6 +175,9 @@ setup(
'cryptography',
'xmltodict',
'phonenumbers',
'qrcode',
'pillow',
'pynacl',
],
cmdclass={
'build': build,

View File

@ -5,7 +5,7 @@
<IDENT>11111</IDENT>
<CIVILITE />
<NOM>CALUIRE TEST</NOM>
<PRENOM>Enfant 1 </PRENOM>
<PRENOM />
<NAISSANCE>10/10/2013</NAISSANCE>
<SEXE>M</SEXE>
<NOMJF />

Binary file not shown (new image, 2.4 KiB).

View File

@ -0,0 +1,7 @@
--uuid:7902e9bd-21a8-4632-8760-d79a67eb89a1
Content-Id: <rootpart*7902e9bd-21a8-4632-8760-d79a67eb89a1@example.jaxws.sun.com>
Content-Type: application/xop+xml;charset=utf-8;type="application/soap+xml"
Content-Transfer-Encoding: binary
<?xml version='1.0' encoding='UTF-8'?><S:Envelope xmlns:S="http://www.w3.org/2003/05/soap-envelope"><S:Header><Action xmlns="http://www.w3.org/2005/08/addressing">http://www.w3.org/2005/08/addressing/fault</Action><MessageID xmlns="http://www.w3.org/2005/08/addressing">uuid:1e213982-4a8d-4722-bb24-a2c5f48dd5c7</MessageID><RelatesTo xmlns="http://www.w3.org/2005/08/addressing">urn:uuid:93d3b273-c123-4277-b288-24a77b1e90dc</RelatesTo><To xmlns="http://www.w3.org/2005/08/addressing">http://www.w3.org/2005/08/addressing/anonymous</To></S:Header><S:Body><S:Fault xmlns:ns4="http://schemas.xmlsoap.org/soap/envelope/"><S:Code><S:Value>S:Receiver</S:Value></S:Code><S:Reason><S:Text xml:lang="fr">Votre guichet enregistreur ne couvre pas au moins une des communes souhaitées de la demande de logement.</S:Text></S:Reason></S:Fault></S:Body></S:Envelope>
--uuid:7902e9bd-21a8-4632-8760-d79a67eb89a1--

View File

@ -0,0 +1,7 @@
--uuid:7902e9bd-21a8-4632-8760-d79a67eb89a1
Content-Id: <rootpart*7902e9bd-21a8-4632-8760-d79a67eb89a1@example.jaxws.sun.com>
Content-Type: application/xop+xml;charset=utf-8;type="application/soap+xml"
Content-Transfer-Encoding: binary
<?xml version='1.0' encoding='UTF-8'?><S:Envelope xmlns:S="http://www.w3.org/2003/05/soap-envelope"><S:Header><Action xmlns="http://www.w3.org/2005/08/addressing">http://www.w3.org/2005/08/addressing/fault</Action><MessageID xmlns="http://www.w3.org/2005/08/addressing">uuid:1e213982-4a8d-4722-bb24-a2c5f48dd5c7</MessageID><RelatesTo xmlns="http://www.w3.org/2005/08/addressing">urn:uuid:93d3b273-c123-4277-b288-24a77b1e90dc</RelatesTo><To xmlns="http://www.w3.org/2005/08/addressing">http://www.w3.org/2005/08/addressing/anonymous</To></S:Header><S:Body><S:Fault xmlns:ns4="http://schemas.xmlsoap.org/soap/envelope/"><S:Code><S:Value>S:Receiver</S:Value></S:Code><S:Reason><S:Text xml:lang="fr">Unkown error</S:Text></S:Reason></S:Fault></S:Body></S:Envelope>
--uuid:7902e9bd-21a8-4632-8760-d79a67eb89a1--

View File

@ -0,0 +1,9 @@
<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">
<soap:Body>
<ns2:preSubscribeSchoolPerimResponse xmlns:ns2="family.ws.maelis.sigec.com">
<resultSubscribeBean>
<returnMessage>E113 : Il existe déjà une inscription scolaire pour cet enfant</returnMessage>
</resultSubscribeBean>
</ns2:preSubscribeSchoolPerimResponse>
</soap:Body>
</soap:Envelope>

View File

@ -0,0 +1,13 @@
<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">
<soap:Body>
<soap:Fault>
<faultcode>soap:Server</faultcode>
<faultstring>E25 : Cette personne nappartient pas à cette famille</faultstring>
<detail>
<ns1:MaelisFamilyException xmlns:ns1="family.ws.maelis.sigec.com">
<message xmlns:ns2="family.ws.maelis.sigec.com">E25 : Cette personne nappartient pas à cette famille</message>
</ns1:MaelisFamilyException>
</detail>
</soap:Fault>
</soap:Body>
</soap:Envelope>

View File

@ -0,0 +1,106 @@
<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">
<soap:Body>
<ns2:readInvoicesResponse xmlns:ns2="ws.maelis.sigec.com">
<invoiceList>
<numInvoice>8</numInvoice>
<idInvoice>F10055591232</idInvoice>
<libelleTTF>CLAE JANVIER 2023</libelleTTF>
<regie>
<code>102</code>
<libelle>CANTINE / CLAE</libelle>
</regie>
<numFamily>1312</numFamily>
<name>SIMPSON MARGE</name>
<amountInvoice>952503.6</amountInvoice>
<amountPaid>952503.6</amountPaid>
<amountPaidTG>0</amountPaidTG>
<dateInvoice>2023-02-24T00:00:00+01:00</dateInvoice>
<dateDeadline>2023-03-24T00:00:00+01:00</dateDeadline>
<payer>
<num>261483</num>
<lastname>SIMPSON</lastname>
<firstname>MARGE</firstname>
<civility>MME</civility>
</payer>
<lineInvoiceList>
<numLine>1</numLine>
<numPers>261485</numPers>
<idActivity>A10049327692</idActivity>
<idUnit>A10049327693</idUnit>
<libelleLine>Calendrier CLAE SOIR 22/23</libelleLine>
<name>SIMPSON BART</name>
<dateStart>2023-01-02T00:00:00+01:00</dateStart>
<dateEnd>2023-02-28T00:00:00+01:00</dateEnd>
<quantity>11.0</quantity>
<unitPrice>22500.0</unitPrice>
<amountLine>247500</amountLine>
</lineInvoiceList>
<lineInvoiceList>
<numLine>2</numLine>
<numPers>261488</numPers>
<idActivity>A10049327692</idActivity>
<idUnit>A10049327693</idUnit>
<libelleLine>Calendrier CLAE SOIR 22/23</libelleLine>
<name>SIMPSON LISA</name>
<dateStart>2023-01-02T00:00:00+01:00</dateStart>
<dateEnd>2023-02-28T00:00:00+01:00</dateEnd>
<quantity>6.0</quantity>
<unitPrice>22500.0</unitPrice>
<amountLine>135000</amountLine>
</lineInvoiceList>
<lineInvoiceList>
<numLine>3</numLine>
<numPers>261485</numPers>
<idActivity>A10049327689</idActivity>
<idUnit>A10049327690</idUnit>
<libelleLine>Calendrier CLAE MATIN 22/23</libelleLine>
<name>SIMPSON BART</name>
<dateStart>2023-01-02T00:00:00+01:00</dateStart>
<dateEnd>2023-02-28T00:00:00+01:00</dateEnd>
<quantity>6.0</quantity>
<unitPrice>30000.0</unitPrice>
<amountLine>180000</amountLine>
</lineInvoiceList>
<lineInvoiceList>
<numLine>4</numLine>
<numPers>261489</numPers>
<idActivity>A10049327689</idActivity>
<idUnit>A10049327690</idUnit>
<libelleLine>Calendrier CLAE MATIN 22/23</libelleLine>
<name>SIMPSON MAGGIE</name>
<dateStart>2023-01-02T00:00:00+01:00</dateStart>
<dateEnd>2023-02-28T00:00:00+01:00</dateEnd>
<quantity>8.0</quantity>
<unitPrice>30000.0</unitPrice>
<amountLine>240000</amountLine>
</lineInvoiceList>
<lineInvoiceList>
<numLine>5</numLine>
<numPers>261490</numPers>
<idActivity>A10049327689</idActivity>
<idUnit>A10049327690</idUnit>
<libelleLine>Calendrier CLAE MATIN 22/23</libelleLine>
<name>SIMPSON HUGO</name>
<dateStart>2023-01-02T00:00:00+01:00</dateStart>
<dateEnd>2023-02-28T00:00:00+01:00</dateEnd>
<quantity>5.0</quantity>
<unitPrice>30000.0</unitPrice>
<amountLine>150000</amountLine>
</lineInvoiceList>
<lineInvoiceList>
<numLine>6</numLine>
<numPers>261485</numPers>
<idActivity>A10049327686</idActivity>
<idUnit>A10049327687</idUnit>
<libelleLine>Calendrier CLAE MIDI 22/23</libelleLine>
<name>SIMPSON BART</name>
<dateStart>2023-01-02T00:00:00+01:00</dateStart>
<dateEnd>2023-02-28T00:00:00+01:00</dateEnd>
<quantity>6.0</quantity>
<unitPrice>0.6</unitPrice>
<amountLine>3.6</amountLine>
</lineInvoiceList>
</invoiceList>
</ns2:readInvoicesResponse>
</soap:Body>
</soap:Envelope>

View File

@ -7,6 +7,7 @@
<code>CRECHCO</code><libelle>Crèche collective</libelle>
</activityType>
<idActivity>M10000000004</idActivity><libelle>CC BOULE DE GOMME</libelle>
<idService>A10049329043</idService>
<manager1>
<lastname>SARRIMANE</lastname><firstname>Valerie</firstname><phone>05 61 62 08 49</phone><poste>ASSO</poste>
</manager1>
@ -22,6 +23,8 @@
<address>
<num>31</num><street1>RUE ROQUELAINE</street1><zipcode>31000</zipcode><town>TOULOUSE</town>
</address>
<latitude>43.606099</latitude>
<longitude>1.430282</longitude>
</place>
</nurseryList>
<nurseryList>
@ -29,6 +32,7 @@
<code>CRECHCO</code><libelle>Crèche collective</libelle>
</activityType>
<idActivity>M10000000005</idActivity><libelle>CC C.A.P.P.E.</libelle>
<idService>A10049329043</idService>
<manager1>
<lastname>CALAZEL</lastname><firstname>Régine</firstname><phone>05 61 55 46 50</phone><poste>ASSO</poste>
</manager1>
@ -44,6 +48,8 @@
<address>
<num>1</num><street1>IMP DES HERONS</street1><zipcode>31400</zipcode><town>TOULOUSE</town>
</address>
<latitude>43.606099</latitude>
<longitude>1.430282</longitude>
</place>
</nurseryList>
<nurseryList>
@ -51,6 +57,7 @@
<code>CRECHCO</code><libelle>Crèche collective</libelle>
</activityType>
<idActivity>M10000000202</idActivity><libelle>Relais Petite Enfance PRADETTES</libelle>
<idService>A10049329043</idService>
<manager1>
<lastname>BUSTAMENTE</lastname><firstname>Joëlle</firstname><phone>05 61 22 36 20</phone><poste>VT</poste>
</manager1>
@ -63,6 +70,8 @@
<address>
<num>12</num><street1>CHE DES PRADETTES</street1><zipcode>31100</zipcode><town>TOULOUSE</town>
</address>
<latitude>43.606099</latitude>
<longitude>1.430282</longitude>
</place>
</nurseryList>
<nurseryList>
@ -70,6 +79,7 @@
<code>CRECHCO</code><libelle>Crèche collective</libelle>
</activityType>
<idActivity>M10000000203</idActivity><libelle>Relais Petite Enfance REPUBLIQUE</libelle>
<idService>A10049329043</idService>
<manager1>
<lastname>LEHMANN</lastname><firstname>Valérie</firstname><phone>05 34 55 78 06</phone><poste>VT</poste>
</manager1>
@ -85,6 +95,8 @@
<address>
<num>48</num><street1>RUE DE LA REPUBLIQUE</street1><zipcode>31300</zipcode><town>TOULOUSE</town>
</address>
<latitude>43.606099</latitude>
<longitude>1.430282</longitude>
</place>
</nurseryList>
<nurseryList>
@ -92,6 +104,7 @@
<code>CRECHCO</code><libelle>Crèche collective</libelle>
</activityType>
<idActivity>M10000000204</idActivity><libelle>SECTEUR 1</libelle><obs2>Secteur 1</obs2>
<idService>A10049329043</idService>
<unitList>
<idUnit>M10053180935</idUnit><libelle>SECTEUR 1 - Réguliers</libelle><typeAcc>REGULAR</typeAcc>
</unitList>
@ -103,6 +116,8 @@
<address>
<num>1</num><street1>RUE DE SEBASTOPOL</street1><zipcode>31000</zipcode><town>TOULOUSE</town>
</address>
<latitude>43.606099</latitude>
<longitude>1.430282</longitude>
</place>
</nurseryList>
<nurseryList>
@ -110,6 +125,7 @@
<code>CRECHCO</code><libelle>Crèche collective</libelle>
</activityType>
<idActivity>M10000000205</idActivity><libelle>SECTEUR 2</libelle><obs2>Secteur 2</obs2>
<idService>A10049329043</idService>
<unitList>
<idUnit>M10053180939</idUnit><libelle>SECTEUR 2 - Occasionnels</libelle><typeAcc>OCCASIONAL</typeAcc>
</unitList>
@ -118,6 +134,8 @@
<address>
<num>1</num><street1>RUE DE SEBASTOPOL</street1><zipcode>31000</zipcode><town>TOULOUSE</town>
</address>
<latitude>43.606099</latitude>
<longitude>1.430282</longitude>
</place>
</nurseryList>
<nurseryList>
@ -125,6 +143,7 @@
<code>CRECHCO</code><libelle>Crèche collective</libelle>
</activityType>
<idActivity>M10000000206</idActivity><libelle>SECTEUR 3</libelle><obs2>Secteur 3</obs2>
<idService>A10049329043</idService>
<unitList>
<idUnit>M10053180941</idUnit><libelle>SECTEUR 3 - Réguliers</libelle><typeAcc>REGULAR</typeAcc>
</unitList>
@ -136,6 +155,8 @@
<address>
<num>1</num><street1>RUE DE SEBASTOPOL</street1><zipcode>31000</zipcode><town>TOULOUSE</town>
</address>
<latitude>43.606099</latitude>
<longitude>1.430282</longitude>
</place>
</nurseryList>
<nurseryList>
@ -165,6 +186,7 @@
<code>CRECHFAM</code><libelle>Crèche familiale</libelle>
</activityType>
<idActivity>M10000000051</idActivity><libelle>C. FAMILIALE GERMAINE CHAUMEL</libelle>
<idService>A10049329048</idService>
<manager1>
<lastname>FRADET</lastname><firstname>FREDERIQUE</firstname><phone>05 34 40 70 59</phone><poste>VT</poste>
</manager1>
@ -180,6 +202,8 @@
<address>
<num>5</num><street1>PL STEPHANE HESSEL</street1><zipcode>31200</zipcode><town>TOULOUSE</town>
</address>
<latitude>43.606099</latitude>
<longitude>1.430282</longitude>
</place>
</nurseryList>
<nurseryList>
@ -187,6 +211,7 @@
<code>CRECHFAM</code><libelle>Crèche familiale</libelle>
</activityType>
<idActivity>M10000000054</idActivity><libelle>C. FAMILIALE LAMPÀGIA</libelle>
<idService>A10049329048</idService>
<manager1>
<lastname>PELLIZZARI</lastname><firstname>SOPHIE</firstname><phone>05 31 22 98 65</phone><poste>VT</poste>
</manager1>
@ -199,6 +224,8 @@
<address>
<num>95</num><street1>RUE ERNEST RENAN</street1><zipcode>31200</zipcode><town>TOULOUSE</town>
</address>
<latitude>43.606099</latitude>
<longitude>1.430282</longitude>
</place>
</nurseryList>
<nurseryList>
@ -207,6 +234,7 @@
<libelle>Crèche collective</libelle>
</activityType>
<idActivity>M10000000001</idActivity>
<idService>A10049327627</idService>
<libelle>CC AMIDONNIERS</libelle>
<manager1>
<lastname>THOMAS</lastname>

View File

@ -5,15 +5,20 @@
<schoolYear>2022</schoolYear>
<dateStartYearSchool>2022-09-01T00:00:00+02:00</dateStartYearSchool>
<dateEndYearSchool>2023-07-07T00:00:00+02:00</dateEndYearSchool>
<dateStartSubscribeSchool>2022-04-01T00:00:00+02:00</dateStartSubscribeSchool>
<dateEndSubscribeSchool>2023-07-08T00:00:00+02:00</dateEndSubscribeSchool>
<dateStartSubscribeSchool>2022-09-01T00:00:00+02:00</dateStartSubscribeSchool>
<dateEndSubscribeSchool>2023-09-01T00:00:00+02:00</dateEndSubscribeSchool>
</yearSchoolList>
<yearSchoolList>
<schoolYear>2023</schoolYear>
<dateStartYearSchool>2023-09-01T00:00:00+02:00</dateStartYearSchool>
<dateStartYearSchool>2023-09-04T00:00:00+02:00</dateStartYearSchool>
<dateEndYearSchool>2024-07-07T00:00:00+02:00</dateEndYearSchool>
<dateStartSubscribeSchool>2022-12-01T00:00:00+01:00</dateStartSubscribeSchool>
<dateEndSubscribeSchool>2023-07-08T00:00:00+02:00</dateEndSubscribeSchool>
<dateStartSubscribeSchool>2022-09-01T00:00:00+02:00</dateStartSubscribeSchool>
<dateEndSubscribeSchool>2024-07-01T00:00:00+02:00</dateEndSubscribeSchool>
</yearSchoolList>
<yearSchoolList>
<schoolYear>2024</schoolYear>
<dateStartYearSchool>2024-09-01T00:00:00+02:00</dateStartYearSchool>
<dateEndYearSchool>2025-07-07T00:00:00+02:00</dateEndYearSchool>
</yearSchoolList>
</ns2:readYearSchoolListResponse>
</soap:Body>

237
tests/js/qrcode.test.js Normal file
View File

@ -0,0 +1,237 @@
import 'qrcode/qrcode-reader.js'
import { expect, test, vi} from 'vitest'
import ZXingBrowser from 'qrcode/zxing-browser.min.js'
import nacl from 'qrcode/nacl.min.js'
// private-key: 98f986e1afe2b546d264e45f00eb3f8d1b331bf3d2fa9e73bea3d8b2c9d90274
//
const okCodeData =
'3%73QJ0UK5G45S4XE0+GEUIKU662$D$K0RTM$4R7L7UX0V19FMFR5A++S6BNFR26 IJ7V15NE1NV2RN+T' +
'VH6%PHBDL*:01/69%EPEDUF7Y47EM6JA7MA79W5DOC$CC4S6G-CBW5C%6$Q6J*6JOCIPC4DC646FM6NE1' +
'AECPED-EDLFF QEGEC9VE634L%6 47%47C%6446D36K/EXVDAVCRWEV2C0/DUF7:967461R67:6OA7Y$5' +
'DE1HEC1WE..DF$DUF7/B8-ED:JCSTDR.C4LE1WE..DF$DUF771904E93DKOEXVDKPCF/DV CP9EIEC*ZC' +
'T34ZA8.Q6$Q6HB8A0'
const invalidCodeData = 'https://georges-abitbol.fr'
const invalidSignatureData =
'SVF.WB899RLB0%7NFKM.IWSBLTVZ65OQKD59REJ+ I/IDL960%HL%F 5EVVK397HQGIUK3OAPOP0RF/X' +
'L1FIKJG08J5 LE.G9%EPEDUF7M.C%47SW64DC4W5NF6$CC1S64VCBW53ECQZCGPCSG6C%6YZC:DC-96N' +
'E1AECPED-EDLFF QEGEC9VE634L%6 47%47C%6446D36K/EXVDAVCRWEV2C0/DUF7:967461R67:6OA7' +
'Y$5DE1HEC1WE..DF$DUF7/B8-ED:JCSTDR.C4LE1WE..DF$DUF771904E93DKOEXVDKPCF/DV CP9EIE' +
'C*ZCT34ZA8.Q6$Q6HB8A0'
const certificateWithoutValidity =
':U8$JSK1IVMJP$E06JCBCVQSIXFZ$HEDJ+S3:%1YN13BHJ$GFGLIN88AL5UNSFG8UG+YUV 2J6701809' +
'B3X3PP0TH9CN2R3IYEDB$DKWEOED0%EIEC ED3.DQ34%R83:6NW6XM8ME1.$E8UCE44QIC+96M.C9%6I' +
'M6E467W5LA76L6%47G%64W5ZJCYX6J*64EC6:6J$6'
const qrcodeReaderTest = test.extend({
mock: async ({ task }, use) => {
let resolveResult = undefined
let resultPromise = new Promise((resolve) => resolveResult = resolve)
let resolveResultHandled = undefined
const terminate = Symbol()
class MockBrowserQRCodeReader {
async decodeFromVideoDevice(device, element, callback) {
while(true) {
const result = await resultPromise
if(result === terminate) {
return
}
resultPromise = new Promise((resolve) => resolveResult = resolve)
callback({ text: result })
resolveResultHandled()
}
}
}
window.nacl = nacl
navigator.mediaDevices = true
const savedZXingBrowser = window.ZXingBrowser
window.ZXingBrowser = { BrowserQRCodeReader : MockBrowserQRCodeReader }
const reader = document.createElement('qrcode-reader')
// private-key: 98f986e1afe2b546d264e45f00eb3f8d1b331bf3d2fa9e73bea3d8b2c9d90274
reader.setAttribute('verify-key', '15dbdd38a2d8a2db1b4dd985da8da2b4e6b785b28db3fe0b34a10cfb3ba0aeb3')
document.append(reader)
await use({
reader,
scan: async (text) => {
const resultHandled = new Promise((resolve) => resolveResultHandled = resolve)
resolveResult(text)
await resultHandled
}
})
vi.useFakeTimers()
reader.remove()
vi.useRealTimers()
resolveResult(terminate)
window.ZXingBrowser = savedZXingBrowser
navigator.mediaDevices = undefined
window.nacl = undefined
}
})
test('qrcode reader shows a warning message if not supported on platform', async ({mock}) => {
const reader = document.createElement('qrcode-reader')
reader.setAttribute('verify-key', 'f81af42f9f9422d2393859d40994a42cdb2ef68507f056292ac96d1de1f1af83')
document.append(reader)
expect(reader.innerText).toBe('not_supported')
})
qrcodeReaderTest('qrcode reader shows valid qrcode information', async ({mock}) => {
const { reader, scan } = mock
vi.setSystemTime(new Date(2023, 11, 1))
await scan(okCodeData)
const popup = reader.querySelector('.qrcode-reader--popup')
const title = popup.querySelector('.qrcode-reader--popup-title')
expect(popup.classList.contains('closed')).toBe(false)
expect(popup.classList.contains('error')).toBe(false)
expect(title.innerText).toBe('valid')
const validity = popup.querySelector('.qrcode-reader--validity')
expect(validity.innerText).toMatch(/from :\s*31\/10\/2023 23:00:00\s*to :\s*01\/12\/2023 22:59:59/)
const labels = popup.querySelectorAll('.qrcode-reader--data-item-label')
expect(labels.length).toBe(3)
expect(labels[0].innerText).toMatch('last_name')
expect(labels[1].innerText).toMatch('first_name')
expect(labels[2].innerText).toMatch('license_plate')
const values = popup.querySelectorAll('.qrcode-reader--data-item-value')
expect(values.length).toBe(3)
expect(values[0].innerText).toMatch('Abitbol')
expect(values[1].innerText).toMatch('Georges')
expect(values[2].innerText).toMatch('HA-424-AH')
const closeButton = reader.querySelector('.qrcode-reader--close-popup-button')
closeButton.dispatchEvent(new Event('click'))
expect(popup.classList.contains('closed')).toBe(true)
})
qrcodeReaderTest('qrcode reader shows error on not yet valid or expired qrcodes', async ({mock}) => {
const { reader, scan } = mock
vi.setSystemTime(new Date(2023, 9, 31)) // months are 0-indexed, wtf JavaScript
await scan(okCodeData)
const popup = reader.querySelector('.qrcode-reader--popup')
const title = popup.querySelector('.qrcode-reader--popup-title')
expect(popup.classList.contains('closed')).toBe(false)
expect(popup.classList.contains('error')).toBe(true)
expect(title.innerText).toBe('not_yet_valid')
vi.setSystemTime(new Date(2023, 11, 2)) // months are 0-indexed, wtf JavaScript
await scan(okCodeData)
expect(popup.classList.contains('closed')).toBe(false)
expect(popup.classList.contains('error')).toBe(true)
expect(title.innerText).toBe('expired')
})
qrcodeReaderTest('qrcode reader shows error on invalid qrcode', async ({mock}) => {
const { reader, scan } = mock
await scan(invalidCodeData)
const popup = reader.querySelector('.qrcode-reader--popup')
const title = reader.querySelector('.qrcode-reader--popup-title')
const content = reader.querySelector('.qrcode-reader--popup-content')
expect(popup.classList.contains('closed')).toBe(false)
expect(popup.classList.contains('error')).toBe(true)
expect(title.innerText).toBe('invalid_title')
expect(content.innerText).toBe('invalid_qrcode')
const closeButton = reader.querySelector('.qrcode-reader--close-popup-button')
closeButton.dispatchEvent(new Event('click'))
expect(popup.classList.contains('closed')).toBe(true)
})
qrcodeReaderTest('qrcode reader shows error on invalid signature', async ({mock}) => {
const { reader, scan } = mock
await scan(invalidSignatureData)
const popup = reader.querySelector('.qrcode-reader--popup')
const title = reader.querySelector('.qrcode-reader--popup-title')
const content = reader.querySelector('.qrcode-reader--popup-content')
expect(popup.classList.contains('closed')).toBe(false)
expect(popup.classList.contains('error')).toBe(true)
expect(title.innerText).toBe('invalid_title')
expect(content.innerText).toBe('invalid_signature')
const closeButton = reader.querySelector('.qrcode-reader--close-popup-button')
closeButton.dispatchEvent(new Event('click'))
expect(popup.classList.contains('closed')).toBe(true)
})
qrcodeReaderTest('qrcode reader can toggle fullscreen', async ({mock}) => {
const { reader } = mock
const fullscreenButton = reader.querySelector('.qrcode-reader--fullscreen-button')
reader.requestFullscreen = vi.fn()
document.exitFullscreen = vi.fn()
fullscreenButton.dispatchEvent(new Event('click'))
expect(reader.requestFullscreen).toHaveBeenCalled()
expect(document.exitFullscreen).not.toHaveBeenCalled()
vi.clearAllMocks()
document.fullscreenElement = reader
fullscreenButton.dispatchEvent(new Event('click'))
expect(reader.requestFullscreen).not.toHaveBeenCalled()
expect(document.exitFullscreen).toHaveBeenCalled()
expect(reader.classList.contains('fullscreen')).toBe(false)
document.fullscreenElement = reader
reader.dispatchEvent(new Event('fullscreenchange'))
expect(reader.classList.contains('fullscreen')).toBe(true)
document.fullscreenElement = undefined
reader.dispatchEvent(new Event('fullscreenchange'))
expect(reader.classList.contains('fullscreen')).toBe(false)
})
qrcodeReaderTest('qrcode reader accepts certificate without validity dates', async ({mock}) => {
const { reader, scan } = mock
await scan(certificateWithoutValidity)
const popup = reader.querySelector('.qrcode-reader--popup')
const title = popup.querySelector('.qrcode-reader--popup-title')
const validity = popup.querySelector('.qrcode-reader--validity')
expect(popup.classList.contains('closed')).toBe(false)
expect(popup.classList.contains('error')).toBe(false)
expect(title.innerText).toBe('valid')
expect(validity.innerText.trim().split(/\s+/)).toStrictEqual(["from", ":", "always", "to", ":", "never"])
})
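
For reference, the data scanned in these tests appears to be base45-encoded content carrying an Ed25519 signature checked against the reader's verify-key attribute (the page loads nacl.min.js). Below is a minimal Python sketch of that verification step only; the base45 and PyNaCl packages and the exact payload layout after signature stripping are assumptions, this is not the project's code:

```python
import base45
import nacl.exceptions
import nacl.signing

def check_signature(qrcode_text, verify_key_hex):
    """Return the signed payload bytes, or None when the signature does not verify."""
    # verify_key_hex is the 32-byte Ed25519 public key in hex, as passed to the
    # qrcode-reader element's verify-key attribute in the tests above.
    verify_key = nacl.signing.VerifyKey(bytes.fromhex(verify_key_hex))
    try:
        signed = base45.b45decode(qrcode_text)
        return verify_key.verify(signed)
    except (ValueError, nacl.exceptions.BadSignatureError):
        return None
```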

View File

@ -25,6 +25,7 @@ INSTALLED_APPS += ( # noqa pylint: disable=undefined-variable
'passerelle.contrib.greco',
'passerelle.contrib.grenoble_gru',
'passerelle.contrib.isere_ens',
'passerelle.contrib.isere_esrh',
'passerelle.contrib.iws',
'passerelle.contrib.lille_urban_card',
'passerelle.contrib.mdph13',
@ -49,6 +50,7 @@ INSTALLED_APPS += ( # noqa pylint: disable=undefined-variable
PASSERELLE_APP_BDP_ENABLED = True
PASSERELLE_APP_GDC_ENABLED = True
PASSERELLE_APP_STRASBOURG_EU_ENABLED = True
PASSERELLE_APP_TOULOUSE_MAELIS_ENABLED = True
TCL_URL_TEMPLATE = 'http://tcl.example.net/%s'
TCL_GEOJSON_URL_TEMPLATE = 'http://tcl.example.net/geojson/%s'

View File

@ -78,6 +78,22 @@ POSITIONS_RESPONSE = """
[{"position":"A","positionLib":"En attente","color":"0, 0, 0"},{"position":"E","positionLib":"Envoi","color":"190, 190, 0"},{"position":"C","positionLib":"En cours","color":"255, 0, 0"},{"position":"D","positionLib":"Envoi signataire","color":"255, 255, 113"},{"position":"T","positionLib":"Termin\u00e9","color":"0, 0, 0"},{"position":"I","positionLib":"\u00c9dition devis","color":"0, 255, 255"},{"position":"R","positionLib":"Refus","color":"255, 0, 0"},{"position":"V","positionLib":"V\u00e9rification","color":"0, 255, 0"},{"position":"F","positionLib":"Devis effectu\u00e9","color":"153, 204, 255"},{"position":"P","positionLib":"Livraison partielle","color":"255, 102, 0"},{"position":"L","positionLib":"Livraison","color":"128, 0, 0"}]
"""
VIEWS_RESPONSE = """
{"views":[{"apivId":"7","apivCode":"ASTECH_FORMDYN","apivNom":"Formulaires dynamiques - champs de saisie"},
{"apivId":"1","apivCode":"ASTECH_BIENS","apivNom":"Liste des biens"}]}
"""
COLUMNS_RESPONSE = """
{"columns":[{"code":"BIEN_ID","des":"Identifiant du bien AS-TECH","type":"NUM","length":"18"},
{"code":"SECTEUR","des":"Secteur","type":"TXT","length":"10"},
{"code":"GENRE","des":"Genre - 1er niveau obligatoire de classification ","type":"TXT","length":"5"}]}
"""
RESULTS_RESPONSE = """[{"BIEN_ID": "2219", "CODE_BIEN": "AC-849-YE", "DESIGNATION": "RENAULT KANGOO"},
{"BIEN_ID": "2220", "CODE_BIEN": "AC-933-EA", "DESIGNATION": "RENAULT MASTER"},
{"BIEN_ID": "2221", "CODE_BIEN": "AC-955-SE", "DESIGNATION": "RENAULT KANGOO"}]
"""
@mock.patch('passerelle.utils.Request.request')
def test_connections(mocked_request, app, setup):
@ -120,7 +136,7 @@ def test_connections(mocked_request, app, setup):
assert response.json['err_class'].endswith('APIError')
assert response.json['err_desc'] == 'AS-TECH response: 500 Crashhhh'
assert response.json['data']['error']['status'] == 500
assert response.json['data']['error']['content'] == 'crash'
assert 'crash' in response.json['data']['error']['content']
mocked_request.return_value = tests.utils.FakedResponse(content='not json', status_code=200, reason='OK')
response = app.get(endpoint)
assert response.json['err'] == 1
@ -435,3 +451,113 @@ def test_positions(mocked_auth, mocked_request, app, setup):
'id': 'A',
'text': 'En attente',
}
@mock.patch('passerelle.utils.Request.request')
@mock.patch('passerelle.apps.astech.models.ASTech.get_authorization')
def test_list_views(mocked_auth, mocked_request, app, setup):
mocked_auth.return_value = {'access_token': '4242', 'connection_id': 'TEST'}
endpoint = reverse(
'generic-endpoint',
kwargs={'connector': 'astech', 'slug': setup.slug, 'endpoint': 'list-views'},
)
mocked_request.return_value = tests.utils.FakedResponse(content=VIEWS_RESPONSE, status_code=200)
response = app.get(endpoint)
assert mocked_request.call_args[0][0] == 'get'
assert mocked_request.call_args[0][1].endswith('apicli/data/views')
assert response.json['data']
for r in response.json['data']:
assert 'id' in r
assert 'text' in r
@mock.patch('passerelle.utils.Request.request')
@mock.patch('passerelle.apps.astech.models.ASTech.get_authorization')
def test_view_columns(mocked_auth, mocked_request, app, setup):
mocked_auth.return_value = {'access_token': '4242', 'connection_id': 'TEST'}
endpoint = reverse(
'generic-endpoint',
kwargs={'connector': 'astech', 'slug': setup.slug, 'endpoint': 'get-view-columns'},
)
mocked_request.return_value = tests.utils.FakedResponse(content=COLUMNS_RESPONSE, status_code=200)
response = app.get(endpoint, params={'code': 'ASTECH_BIENS'})
assert mocked_request.call_args[0][0] == 'get'
assert mocked_request.call_args[0][1].endswith('apicli/data/ASTECH_BIENS/columns')
assert response.json['data']
for r in response.json['data']:
assert 'id' in r
assert 'text' in r
@mock.patch('passerelle.utils.Request.request')
@mock.patch('passerelle.apps.astech.models.ASTech.get_authorization')
def test_view_data(mocked_auth, mocked_request, app, setup):
mocked_auth.return_value = {'access_token': '4242', 'connection_id': 'TEST'}
endpoint = reverse(
'generic-endpoint',
kwargs={'connector': 'astech', 'slug': setup.slug, 'endpoint': 'get-view-data'},
)
mocked_request.return_value = tests.utils.FakedResponse(content=RESULTS_RESPONSE, status_code=200)
response = app.get(
endpoint, params={'code': 'ASTECH_BIENS', 'id_column': 'BIEN_ID', 'text_column': 'DESIGNATION'}
)
assert mocked_request.call_args[0][0] == 'post'
assert mocked_request.call_args[0][1].endswith('apicli/data/ASTECH_BIENS/results')
assert response.json['err'] == 0
assert response.json['data']
for r in response.json['data']:
assert 'id' in r
assert 'text' in r
response = app.get(
endpoint,
params={'code': 'ASTECH_BIENS', 'id_column': 'BIEN_ID', 'text_column': 'DESIGNATION', 'id': 2221},
)
assert len(response.json['data']) == 1
response = app.get(
endpoint,
params={'code': 'ASTECH_BIENS', 'id_column': 'BIEN_ID', 'text_column': 'DESIGNATION', 'q': 'KANGOO'},
)
assert len(response.json['data']) == 2
mocked_request.side_effect = [
tests.utils.FakedResponse(content=COLUMNS_RESPONSE, status_code=200),
tests.utils.FakedResponse(content=RESULTS_RESPONSE, status_code=200),
]
response = app.get(
endpoint,
params={
'code': 'ASTECH_BIENS',
'id_column': 'BIEN_ID',
'text_column': 'DESIGNATION',
'filters': 'GENRE=SIT;SECTEUR=S1',
},
)
assert mocked_request.call_args[0][0] == 'post'
assert mocked_request.call_args[0][1].endswith('apicli/data/ASTECH_BIENS/results')
assert mocked_request.call_args[1]['json'] == {
'data': {
'filters': [
{'field': 'GENRE', 'type': 'TXT', 'filter': {'value': 'SIT', 'operator': 'is_equal'}},
{'field': 'SECTEUR', 'type': 'TXT', 'filter': {'value': 'S1', 'operator': 'is_equal'}},
]
}
}
response = app.get(
endpoint,
params={
'code': 'ASTECH_BIENS',
'id_column': 'BIEN_ID',
'text_column': 'DESIGNATION',
'filters': 'GENRE=TESTING',
},
)
assert response.json['err'] == 1
assert response.json['err_desc'] == 'Value of GENRE exceeds authorized length (5)'
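
The last assertions above pin down both the JSON filter payload and the length check against the column metadata returned by the .../columns call. A rough sketch of that mapping follows; the function name and structure are hypothetical, the connector's real code is not in this diff:

```python
# Hypothetical helper: turn a 'FIELD=value;FIELD2=value2' filters string into
# the payload asserted above, validating lengths against the column metadata.
def build_filters(filters_param, columns):
    by_code = {col['code']: col for col in columns}
    result = []
    for item in filters_param.split(';'):
        field, _, value = item.partition('=')
        col = by_code[field]
        if len(value) > int(col['length']):
            raise ValueError('Value of %s exceeds authorized length (%s)' % (field, col['length']))
        result.append({
            'field': field,
            'type': col['type'],
            'filter': {'value': value, 'operator': 'is_equal'},
        })
    return {'data': {'filters': result}}

columns = [
    {'code': 'SECTEUR', 'des': 'Secteur', 'type': 'TXT', 'length': '10'},
    {'code': 'GENRE', 'des': 'Genre', 'type': 'TXT', 'length': '5'},
]
print(build_filters('GENRE=SIT;SECTEUR=S1', columns))
```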

View File

@ -71,6 +71,41 @@ def test_worksrequest_status(app, connector):
assert json_resp['data']['RequestStateLabel'] == 'En attente'
def test_worksrequest_intervention_status(app, connector):
with responses.RequestsMock() as rsps:
rsps.get(
'https://atal.invalid/api/WorksRequests/GetInterventionStates',
status=200,
json=[
{
'RequestId': 'cc8b7f6b-8ccf-4938-a648-09678feda679',
'InterventionState': 2,
'WorkState': 2,
'InterventionNumber': 'IN23090003',
}
],
)
resp = app.get('/atal-rest/test/worksrequest-intervention-status?number=DIT23070011')
json_resp = resp.json
assert json_resp['err'] == 0
assert json_resp['data']['InterventionState'] == 2
assert json_resp['data']['WorkState'] == 2
assert json_resp['data']['WorkStateLabel'] == 'En cours'
def test_worksrequest_intervention_status_empty_list(app, connector):
with responses.RequestsMock() as rsps:
rsps.get(
'https://atal.invalid/api/WorksRequests/GetInterventionStates',
status=200,
json=[],
)
resp = app.get('/atal-rest/test/worksrequest-intervention-status?number=DIT23070011')
json_resp = resp.json
assert json_resp['err'] == 0
assert json_resp['data'] == {'WorkStateLabel': ''}
def test_worksrequests_single_attachment(app, connector):
with responses.RequestsMock() as rsps:
rsps.post('https://atal.invalid/api/WorksRequests/1/Attachments', status=200, body=b'')
@ -82,6 +117,51 @@ def test_worksrequests_single_attachment(app, connector):
assert json_resp['err'] == 0
def test_worksrequests_single_attachment_no_data(app, connector):
with responses.RequestsMock() as rsps:
params = {
'file': '',
}
resp = app.post_json(
'/atal-rest/test/worksrequests-single-attachment?worksrequests_id=1', params=params
)
json_resp = resp.json
assert json_resp['err'] == 0
assert len(rsps.calls) == 0
def test_worksrequests_single_attachment_string_not_empty(app, connector):
params = {
'file': 'aaa',
}
app.post_json(
'/atal-rest/test/worksrequests-single-attachment?worksrequests_id=1', params=params, status=400
)
def test_worksrequests_single_attachment_error(app, connector):
with responses.RequestsMock() as rsps:
rsps.post(
'https://atal.invalid/api/WorksRequests/1/Attachments',
status=400,
json={
'type': 'https://tools.ietf.org/html/rfc7231#section-6.5.1',
'title': 'Bad Request',
'status': 400,
'"detail': 'No content","traceId":"00-1034a23a6cfbb7c508aa7e125a8e9a52-4570fc75745b7d1d-00',
},
)
params = {
'file': {'filename': 'bla', 'content': base64.b64encode(b'bla').decode('utf-8')},
}
resp = app.post_json(
'/atal-rest/test/worksrequests-single-attachment?worksrequests_id=1', params=params
)
json_resp = resp.json
assert json_resp['err'] == 1
assert json_resp['data']['title'] == 'Bad Request'
def test_worksrequests_attachments(app, connector):
with responses.RequestsMock() as rsps:
rsps.post('https://atal.invalid/api/WorksRequests/Attachments', status=200, body=b'')
@ -97,6 +177,26 @@ def test_worksrequests_attachments(app, connector):
assert json_resp['err'] == 0
def test_worksrequests_attachments_no_data(app, connector):
with responses.RequestsMock() as rsps:
params = {
'files': ['', ''],
'worksrequests_ids': ['0', '1'],
}
resp = app.post_json('/atal-rest/test/worksrequests-attachments', params=params)
json_resp = resp.json
assert json_resp['err'] == 0
assert len(rsps.calls) == 0
def test_worksrequests_attachments_string_not_empty(app, connector):
params = {
'files': ['aa'],
'worksrequests_ids': ['0', '1'],
}
app.post_json('/atal-rest/test/worksrequests-attachments', params=params, status=400)
def test_worksrequests_attachments_error(app, connector):
with responses.RequestsMock() as rsps:
rsps.post(

52
tests/test_base.py Normal file
View File

@ -0,0 +1,52 @@
# Copyright (C) 2021 Entr'ouvert
#
# This program is free software: you can redistribute it and/or modify it
# under the terms of the GNU Affero General Public License as published
# by the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU Affero General Public License for more details.
#
# You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <http://www.gnu.org/licenses/>.
import pytest
from passerelle.base.models import BaseResource
from tests.utils import ResponsesSoap
@pytest.fixture
def dummy_resource_class():
class DummyResource(BaseResource):
class logging_parameters:
log_level = 30
trace_emails = ''
def down(self):
return False
class Meta:
app_label = 'tests'
return DummyResource
def test_soap_client_method(dummy_resource_class):
with open('passerelle/contrib/toulouse_maelis/tools/wsdl/ActivityService.wsdl', 'rb') as fd:
wsdl_url = 'https://example.org/ActivityService?wsdl'
responses_soap = ResponsesSoap(
wsdl_url=wsdl_url,
wsdl_content=fd.read(),
)
with responses_soap():
resource = dummy_resource_class(
pk='x'
) # pk is necessary for the instance to be hashable and used as a key in the soap_client cache
assert resource.soap_client(wsdl_url=wsdl_url) != resource.soap_client(wsdl_url=wsdl_url)
dummy_resource_class.soap_client_cache_timeout = 300
assert resource.soap_client(wsdl_url=wsdl_url) == resource.soap_client(wsdl_url=wsdl_url)
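
soap_client() itself is not part of this diff; the test only pins its caching contract: with soap_client_cache_timeout left at zero each call builds a fresh client, and with a non-zero timeout the client is reused per resource pk and WSDL URL. A hypothetical sketch of such a cache (names and layout assumed, not the BaseResource implementation):

```python
import time

_CLIENT_CACHE = {}

def cached_soap_client(resource, wsdl_url, build_client):
    timeout = getattr(resource, 'soap_client_cache_timeout', 0)
    if not timeout:
        return build_client(wsdl_url)  # no caching configured
    key = (resource.pk, wsdl_url)  # the resource needs a pk to be usable as a cache key
    client, expires = _CLIENT_CACHE.get(key, (None, 0.0))
    if client is None or time.monotonic() >= expires:
        client = build_client(wsdl_url)
        _CLIENT_CACHE[key] = (client, time.monotonic() + timeout)
    return client
```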

View File

@ -561,7 +561,7 @@ def test_family_info_endpoint(app, resource):
assert set(resp.json['data'].keys()) == {'family_id', 'CODE', 'MEMBRE', 'RESPONSABLE1', 'RESPONSABLE2'}
assert resp.json['data']['family_id'] == 'YYY'
assert resp.json['data']['MEMBRE'][0]['id'] == '11111'
assert resp.json['data']['MEMBRE'][0]['text'] == 'Enfant 1 CALUIRE TEST'
assert resp.json['data']['MEMBRE'][0]['text'] == 'CALUIRE TEST'
assert resp.json['data']['MEMBRE'][1]['id'] == '22222'
assert resp.json['data']['MEMBRE'][1]['text'] == 'Enfant 2 CALUIRE TEST'

View File

@ -18,7 +18,7 @@ import base64
import os
import re
from unittest import mock
from unittest.mock import Mock, call
from unittest.mock import MagicMock, Mock, PropertyMock, call
import py
import pytest
@ -35,8 +35,10 @@ from django.contrib.contenttypes.models import ContentType
from django.urls import reverse
from django.utils.encoding import force_bytes, force_str
from passerelle.apps.cmis import models
from passerelle.apps.cmis.models import CmisConnector
from passerelle.base.models import AccessRight, ApiUser
from passerelle.utils.jsonresponse import APIError
from tests.test_manager import login
@ -79,9 +81,8 @@ def test_uploadfile(app, setup, tmpdir, monkeypatch):
file_name = 'testfile.whatever'
file_content = 'aaaa'
monkeypatch.chdir(tmpdir)
import passerelle.apps.cmis.models
monkeypatch.setattr(passerelle.apps.cmis.models, 'CMISGateway', FakeCMISGateway)
monkeypatch.setattr(models, 'CMISGateway', FakeCMISGateway)
response = app.post_json(
'/cmis/slug-cmis/uploadfile',
params={
@ -120,9 +121,7 @@ def test_upload_file_metadata(app, setup, monkeypatch):
def createDocument(self, filename, contentFile, properties, contentType=None):
return Mock(properties=properties)
from passerelle.apps.cmis.models import CMISGateway
monkeypatch.setattr(CMISGateway, '_get_or_create_folder', lambda x, y: FakeFolder())
monkeypatch.setattr(models.CMISGateway, '_get_or_create_folder', lambda x, y: FakeFolder())
response = app.post_json(
'/cmis/slug-cmis/uploadfile',
params={
@ -296,14 +295,11 @@ def test_uploadfile_error_if_no_proper_base64_encoding(app, setup):
def test_uploadfile_cmis_gateway_error(app, setup, monkeypatch):
from passerelle.utils.jsonresponse import APIError
cmis_gateway = Mock()
cmis_gateway.create_doc.side_effect = APIError('some error')
cmis_gateway_cls = Mock(return_value=cmis_gateway)
import passerelle.apps.cmis.models
monkeypatch.setattr(passerelle.apps.cmis.models, 'CMISGateway', cmis_gateway_cls)
monkeypatch.setattr(models, 'CMISGateway', cmis_gateway_cls)
response = app.post_json(
'/cmis/slug-cmis/uploadfile',
params={
@ -315,83 +311,95 @@ def test_uploadfile_cmis_gateway_error(app, setup, monkeypatch):
assert response.json['err_desc'].startswith('some error')
def test_get_or_create_folder_already_existing(monkeypatch):
default_repository = Mock()
default_repository.getObjectByPath.return_value = 'folder'
cmis_client_cls = Mock(return_value=Mock(spec=CmisClient, defaultRepository=default_repository))
import passerelle.apps.cmis.models
class TestGetOrCreateFolder:
@pytest.fixture
def default_repository(self, monkeypatch):
default_repository = MagicMock()
cmis_client_cls = Mock(return_value=Mock(spec=CmisClient, defaultRepository=default_repository))
monkeypatch.setattr(models, 'CmisClient', cmis_client_cls)
return default_repository
monkeypatch.setattr(passerelle.apps.cmis.models, 'CmisClient', cmis_client_cls)
gateway = passerelle.apps.cmis.models.CMISGateway('cmis_endpoint', 'user', 'pass', Mock())
assert gateway._get_or_create_folder('/whatever') == 'folder'
default_repository.getObjectByPath.assert_has_calls([call('/whatever')])
@pytest.fixture
def gateway(self, default_repository):
return models.CMISGateway('cmis_endpoint', 'user', 'pass', Mock())
def test_get_or_create_folder_already_existing(self, gateway, default_repository):
default_repository.getObjectByPath.return_value = 'folder'
def test_get_or_create_folder_one_level_creation(monkeypatch):
root_folder = Mock()
root_folder.createFolder.return_value = 'folder'
default_repository = Mock(
rootFolder=root_folder, **{'getObjectByPath.side_effect': ObjectNotFoundException()}
)
cmis_client_cls = Mock(return_value=Mock(spec=CmisClient, defaultRepository=default_repository))
import passerelle.apps.cmis.models
assert gateway._get_or_create_folder('/whatever') == 'folder'
default_repository.getObjectByPath.assert_has_calls([call('/whatever')])
monkeypatch.setattr(passerelle.apps.cmis.models, 'CmisClient', cmis_client_cls)
gateway = passerelle.apps.cmis.models.CMISGateway('cmis-url', 'user', 'password', Mock())
assert gateway._get_or_create_folder('/whatever') == 'folder'
default_repository.getObjectByPath.assert_has_calls([call('/whatever'), call('/whatever')])
root_folder.createFolder.assert_called_once_with('whatever')
def test_get_or_create_folder_one_level_creation(self, gateway, default_repository):
default_repository.getObjectByPath.side_effect = ObjectNotFoundException()
default_repository.root_folder = Mock(createFolder=Mock(return_value='folder'))
assert gateway._get_or_create_folder('/whatever') == 'folder'
def test_get_or_create_folder_two_level_creation(monkeypatch):
whatever_folder = Mock()
whatever_folder.createFolder.return_value = 'folder'
root_folder = Mock()
root_folder.createFolder.return_value = whatever_folder
default_repository = Mock(rootFolder=root_folder)
default_repository.getObjectByPath.side_effect = ObjectNotFoundException()
cmis_client_cls = Mock(return_value=Mock(spec=CmisClient, defaultRepository=default_repository))
import passerelle.apps.cmis.models
default_repository.getObjectByPath.assert_has_calls([call('/whatever')])
default_repository.root_folder.createFolder.assert_called_once_with('whatever')
monkeypatch.setattr(passerelle.apps.cmis.models, 'CmisClient', cmis_client_cls)
gateway = passerelle.apps.cmis.models.CMISGateway('cmis_url', 'user', 'password', Mock())
assert gateway._get_or_create_folder('/whatever/man') == 'folder'
default_repository.getObjectByPath.assert_has_calls(
[call('/whatever/man'), call('/whatever'), call('/whatever/man')]
)
root_folder.createFolder.assert_called_once_with('whatever')
whatever_folder.createFolder.assert_called_once_with('man')
def test_get_or_create_folder_two_level_creation(self, gateway, default_repository):
default_repository.getObjectByPath.side_effect = [
ObjectNotFoundException(),
ObjectNotFoundException(),
]
default_repository.root_folder.createFolder.return_value.createFolder.return_value = 'folder'
assert gateway._get_or_create_folder('/whatever/man') == 'folder'
def test_get_or_create_folder_with_some_existing_and_some_not(monkeypatch):
whatever_folder = Mock()
whatever_folder.createFolder.return_value = 'folder'
assert default_repository.mock_calls == [
call.getObjectByPath('/whatever/man'),
call.getObjectByPath('/whatever'),
call.root_folder.createFolder('whatever'),
call.root_folder.createFolder().createFolder('man'),
]
def getObjectByPath(path):
if path == '/whatever':
return whatever_folder
elif path == '/whatever/man':
raise ObjectNotFoundException()
else:
raise Exception('I should not be called with: %s' % path)
def test_get_or_create_folder_with_some_existing_and_some_not(self, gateway, default_repository):
default_repository.getObjectByPath.side_effect = [
ObjectNotFoundException(),
mock.DEFAULT,
Exception('Boom!'),
]
default_repository.getObjectByPath.return_value.createFolder.return_value = 'folder'
root_folder = Mock()
default_repository = Mock(rootFolder=root_folder)
default_repository.getObjectByPath.side_effect = getObjectByPath
cmis_client_cls = Mock(return_value=Mock(spec=CmisClient, defaultRepository=default_repository))
import passerelle.apps.cmis.models
assert gateway._get_or_create_folder('/whatever/man') == 'folder'
monkeypatch.setattr(passerelle.apps.cmis.models, 'CmisClient', cmis_client_cls)
gateway = passerelle.apps.cmis.models.CMISGateway('cmis_url', 'user', 'password', Mock())
assert gateway._get_or_create_folder('/whatever/man') == 'folder'
root_folder.createFolder.assert_not_called()
whatever_folder.createFolder.assert_called_once_with('man')
assert default_repository.mock_calls == [
call.getObjectByPath('/whatever/man'),
call.getObjectByPath('/whatever'),
call.getObjectByPath().createFolder('man'),
]
def test_get_or_create_folder_permission_denied_on_get_object_by_path(self, gateway, default_repository):
default_repository.getObjectByPath.side_effect = [
ObjectNotFoundException(),
PermissionDeniedException(),
]
with pytest.raises(APIError, match=r'CMIS server denied reading folder /whatever'):
gateway._get_or_create_folder('/whatever/man')
def test_get_or_create_folder_permission_denied_on_create(self, gateway, default_repository):
default_repository.getObjectByPath.side_effect = [
ObjectNotFoundException(),
mock.DEFAULT,
Exception('Boom!'),
]
default_repository.getObjectByPath.return_value.createFolder.side_effect = PermissionDeniedException()
with pytest.raises(APIError, match=r'CMIS server denied creating folder /whatever/man'):
gateway._get_or_create_folder('/whatever/man')
def test_get_or_create_folder_permission_denied_on_root_folder(self, gateway, default_repository):
default_repository.getObjectByPath.side_effect = ObjectNotFoundException()
type(default_repository).root_folder = PropertyMock(side_effect=PermissionDeniedException())
with pytest.raises(APIError, match=r'CMIS server denied reading folder /'):
gateway._get_or_create_folder('/whatever')
def test_create_doc():
from passerelle.apps.cmis.models import CMISGateway
gateway = CMISGateway('cmis_url', 'user', 'password', Mock())
gateway = models.CMISGateway('cmis_url', 'user', 'password', Mock())
folder = Mock()
folder.createDocument.return_value = 'doc'
gateway._get_or_create_folder = Mock(return_value=folder)
@ -413,10 +421,7 @@ def test_create_doc():
],
)
def test_wrap_cmis_error(app, setup, monkeypatch, cmis_exc, err_msg):
from passerelle.apps.cmis.models import wrap_cmis_error
from passerelle.utils.jsonresponse import APIError
@wrap_cmis_error
@models.wrap_cmis_error
def dummy_func():
raise cmis_exc('some error')
@ -493,9 +498,8 @@ def test_cmis_types_view(setup, app, admin_user, monkeypatch):
repo = FakeCmisRepo(root_types)
cmis_client_cls = Mock(return_value=Mock(spec=CmisClient, defaultRepository=repo))
import passerelle.apps.cmis.models
monkeypatch.setattr(passerelle.apps.cmis.models, 'CmisClient', cmis_client_cls)
monkeypatch.setattr(models, 'CmisClient', cmis_client_cls)
app = login(app)
resp = app.get('/cmis/slug-cmis/')
@ -568,9 +572,8 @@ def test_cmis_check_status(app, setup, monkeypatch):
cmis_gateway = Mock()
type(cmis_gateway).repo = mock.PropertyMock(side_effect=CmisException)
cmis_gateway_cls = Mock(return_value=cmis_gateway)
import passerelle.apps.cmis.models
monkeypatch.setattr(passerelle.apps.cmis.models, 'CMISGateway', cmis_gateway_cls)
monkeypatch.setattr(models, 'CMISGateway', cmis_gateway_cls)
with pytest.raises(CmisException):
setup.check_status()
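
The TestGetOrCreateFolder cases above describe the lookup order without showing CMISGateway's implementation. As a sketch only (helper name, structure and the root_folder attribute follow the mocks used in the tests), the behaviour they encode is: try the full path, walk back towards the root until an existing parent is found, then create the missing folders; the wrapping of permission errors into APIError is left out here:

```python
from cmislib.exceptions import ObjectNotFoundException

def get_or_create_folder(repo, path):
    parts = [p for p in path.split('/') if p]
    # look for the deepest existing ancestor, starting from the full path
    for depth in range(len(parts), 0, -1):
        try:
            folder = repo.getObjectByPath('/' + '/'.join(parts[:depth]))
            break
        except ObjectNotFoundException:
            continue
    else:
        depth, folder = 0, repo.root_folder
    # then create whatever is still missing below the first existing parent
    for name in parts[depth:]:
        folder = folder.createFolder(name)
    return folder
```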

View File

@ -8,7 +8,7 @@ from passerelle.apps.base_adresse.models import BaseAdresse
def test_cron_frequencies(db):
for frequency in ('hourly', 'daily', 'weekly', 'monthly'):
for frequency in ('every5min', 'hourly', 'daily', 'weekly', 'monthly'):
call_command('cron', frequency)
with pytest.raises(CommandError):
call_command('cron', 'randomly')

View File

@ -110,6 +110,7 @@ def test_proxy_logger(mocked_get, caplog, app, arcgis):
'template': '{{ attributes.NOM }}',
'id_template': '{{ attributes.NUMERO }}',
},
headers={'Publik-Caller-URL': 'https://wcs.invalid/backoffice/management/foo/1/'},
status=200,
)
@ -122,6 +123,7 @@ def test_proxy_logger(mocked_get, caplog, app, arcgis):
assert log.extra['connector'] == 'arcgis'
assert log.extra['connector_endpoint'] == 'mapservice-query'
assert log.extra['connector_endpoint_method'] == 'GET'
assert log.extra['publik_caller_url'] == 'https://wcs.invalid/backoffice/management/foo/1/'
assert '/arcgis/test/mapservice-query?' in log.extra['connector_endpoint_url']
# Resource Generic Logger

282
tests/test_isere_esrh.py Normal file
View File

@ -0,0 +1,282 @@
# passerelle - uniform access to multiple data sources and services
# Copyright (C) 2023 Entr'ouvert
#
# This program is free software: you can redistribute it and/or modify it
# under the terms of the GNU Affero General Public License as published
# by the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU Affero General Public License for more details.
#
# You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <http://www.gnu.org/licenses/>.
import urllib.parse
import httmock
import pytest
from requests import ConnectionError, RequestException, Timeout
from passerelle.contrib.isere_esrh.models import IsereESRH
from tests.utils import setup_access_rights
@pytest.fixture()
def connector(db):
return setup_access_rights(IsereESRH.objects.create(slug='test', base_url='http://esrh.net'))
def test_connection_error_handling(app, connector):
def connection_error(url, request):
raise ConnectionError('oops')
def timeout(url, request):
raise Timeout('oops')
def request_error(url, request):
raise RequestException('oops')
for handler in [connection_error, timeout, request_error]:
with httmock.HTTMock(handler):
response = app.get('/isere-esrh/test/official?number=004242&authority=CG38', status=200)
assert response.json['err_desc'] == 'HTTP request failed'
assert response.json['data'] == {'exception': 'oops'}
def invalid_json(url, request):
return httmock.response(200, 'bad_json')
with httmock.HTTMock(invalid_json):
response = app.get('/isere-esrh/test/official?number=004242&authority=CG38', status=200)
assert response.json['err_desc'] == 'ESRH returned invalid json'
assert 'Expecting value' in response.json['data']['exception']
def invalid_json_content(url, request):
return httmock.response(200, '[]')
with httmock.HTTMock(invalid_json_content):
response = app.get('/isere-esrh/test/official?number=004242&authority=CG38', status=200)
assert (
response.json['err_desc']
== 'ESRH returned malformed json : expecting a dictionary with a "values" key containing a list of objects.'
)
def test_official(app, connector, freezer):
freezer.move_to('1871-03-18 13:13:00')
@httmock.urlmatch()
def error_handler(url, request):
assert False, 'should not be reached'
@httmock.urlmatch(path=r'^/api/v2/Agent$', query='numero=004242&collectivite=CG38')
def mock_official_not_found(url, request):
return httmock.response(200, {'values': []})
with httmock.HTTMock(mock_official_not_found, error_handler):
response = app.get('/isere-esrh/test/official?number=004242&authority=CG38')
assert response.json['err'] == 0
assert response.json['data'] is None
@httmock.urlmatch(path=r'^/api/v2/Agent$', query='numero=004242&collectivite=CG38')
def mock_malformed_response(url, request):
return httmock.response(200, {'values': [{}]})
with httmock.HTTMock(mock_malformed_response, error_handler):
response = app.get('/isere-esrh/test/official?number=004242&authority=CG38')
assert response.json['err'] == 1
assert (
response.json['err_desc']
== 'Malformed response : values elements are expected to be objects with an "agentId" key'
)
@httmock.urlmatch(path=r'^/api/v2/Agent$', query='numero=004242&collectivite=CG38')
def mock_official(url, request):
return httmock.response(
200,
{
'values': [
{
'nom': 'Abitbol',
'agentId': 12,
'prenom': 'Georges',
'matricule': '004242',
'observation': 'Agent polyvalent en gestion de classe',
},
]
},
)
@httmock.urlmatch(
path=r'^/api/v2/Agent/12/DossiersStatutaire$',
query=f'aDate={urllib.parse.quote("1871-03-18T13:13:00+00:00")}',
)
def mock_empty_file(url, request):
return httmock.response(200, {'values': []})
with httmock.HTTMock(mock_official, mock_empty_file, error_handler):
response = app.get('/isere-esrh/test/official?number=004242&authority=CG38')
assert response.json['data'] == {
'agentId': 12,
'nom': 'Abitbol',
'prenom': 'Georges',
'matricule': '004242',
'observation': 'Agent polyvalent en gestion de classe',
'DossiersStatutaire': [],
}
@httmock.urlmatch(
path=r'^/api/v2/Agent/12/DossiersStatutaire$',
query=f'aDate={urllib.parse.quote("1871-03-18T13:13:00+00:00")}',
)
def mock_file(url, request):
return httmock.response(
200,
{
'values': [
{
'dossierStatutaireId': 1,
'grade': {
'gradeId': 2,
'code': 'T34',
'libelle': 'Adjoint du sherif',
'cadreEmploiId': 3,
},
}
]
},
)
with httmock.HTTMock(mock_official, mock_file, error_handler):
response = app.get('/isere-esrh/test/official?number=004242&authority=CG38')
assert response.json['data'] == {
'agentId': 12,
'nom': 'Abitbol',
'prenom': 'Georges',
'matricule': '004242',
'observation': 'Agent polyvalent en gestion de classe',
'DossiersStatutaire': [
{
'dossierStatutaireId': 1,
'grade': {
'gradeId': 2,
'code': 'T34',
'libelle': 'Adjoint du sherif',
'cadreEmploiId': 3,
},
}
],
}
def test_entities(app, connector, freezer):
freezer.move_to('1871-03-18 13:13:00')
@httmock.urlmatch()
def error_handler(url, request):
assert False, 'should not be reached'
@httmock.urlmatch(
path=r'^/api/v2/Entite$', query=f'aDate={urllib.parse.quote("1871-03-18T13:13:00+00:00")}'
)
def mock_entites(url, request):
return httmock.response(
200,
{
'values': [
{
'entiteId': 1,
'code': '6500',
'libelle': 'dir. de la classe internationale',
'region': 'ouest',
},
{'entiteId': 2, 'code': '650001', 'libelle': 'sce. des sapes', 'region': 'nord'},
{
'entiteId': 3,
'code': '6400',
'libelle': 'dir. des dinosaures de droite',
'region': 'sud',
},
]
},
)
entity_1 = {
'id': 1,
'text': 'dir. de la classe internationale',
'code': '6500',
'region': 'ouest',
'entiteId': 1,
'libelle': 'dir. de la classe internationale',
}
entity_2 = {
'id': 2,
'text': 'sce. des sapes',
'code': '650001',
'region': 'nord',
'entiteId': 2,
'libelle': 'sce. des sapes',
}
entity_3 = {
'id': 3,
'code': '6400',
'text': 'dir. des dinosaures de droite',
'region': 'sud',
'entiteId': 3,
'libelle': 'dir. des dinosaures de droite',
}
with httmock.HTTMock(mock_entites, error_handler):
response = app.get('/isere-esrh/test/entities')
assert response.json['data'] == [entity_1, entity_2, entity_3]
response = app.get('/isere-esrh/test/entities?label_pattern=^dir\\..*')
assert response.json['data'] == [entity_1, entity_3]
response = app.get('/isere-esrh/test/entities?code_pattern=^6500\\d%2B')
assert response.json['data'] == [entity_2]
response = app.get('/isere-esrh/test/entities?code_pattern=^6500&label_pattern=^dir\\..*')
assert response.json['data'] == [entity_1]
def test_job_types(app, connector, freezer):
freezer.move_to('1871-03-18 13:13:00')
now = urllib.parse.quote('1871-03-18T13:13:00+00:00')
@httmock.urlmatch()
def error_handler(url, request):
assert False, 'should not be reached'
@httmock.urlmatch(
path=r'^/api/v2/Poste$', query=f'codeCollectivite=CG38&avecLibellePoste=True&aDate={now}'
)
def mock_entites(url, request):
return httmock.response(
200,
{
'values': [
{'posteId': 1, 'libelles': [{'libelle': 'Patron de l\'auberge'}], 'ravioles': 'non'},
{'posteId': 2, 'libelles': [], 'ravioles': 'oui'},
{'posteId': 3, 'ravioles': 'non'},
]
},
)
with httmock.HTTMock(mock_entites, error_handler):
response = app.get('/isere-esrh/test/job-types?authority=CG38')
assert response.json['data'] == [
{
'id': 1,
'text': 'Patron de l\'auberge',
'ravioles': 'non',
'posteId': 1,
'libelles': [{'libelle': 'Patron de l\'auberge'}],
},
{'id': 2, 'text': 'N/A', 'ravioles': 'oui', 'posteId': 2, 'libelles': []},
{'id': 3, 'text': 'N/A', 'ravioles': 'non', 'posteId': 3},
]
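
The entities test above exercises optional label_pattern / code_pattern filters plus the id/text enrichment. A sketch of that filtering, assuming the regexes are applied to the libelle and code fields respectively; the endpoint's real code is not part of this diff:

```python
import re

# Hypothetical helper mirroring the behaviour asserted in test_entities.
def format_entities(values, label_pattern=None, code_pattern=None):
    result = []
    for entity in values:
        if label_pattern and not re.search(label_pattern, entity.get('libelle', '')):
            continue
        if code_pattern and not re.search(code_pattern, entity.get('code', '')):
            continue
        result.append({'id': entity['entiteId'], 'text': entity['libelle'], **entity})
    return result
```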

304
tests/test_matrix42.py Normal file
View File

@ -0,0 +1,304 @@
from unittest import mock
import pytest
from django.contrib.contenttypes.models import ContentType
from passerelle.apps.matrix42.models import Matrix42
from passerelle.base.models import AccessRight, ApiUser
from tests.utils import FakedResponse, generic_endpoint_url
pytestmark = pytest.mark.django_db
TOKEN = '{"RawToken": "token2","LifeTime":"2200-09-23T06:39:31.5285469Z"}'
USERS = (
'{"Result":[{"ID":"a9386c3e-cb7a-ed11-a3bb-000d3aaa0172","DisplayString":"User1, Leo",'
'"Expression-TypeCase":"46c86c68-42ae-4089-8398-6e4140fe8658",'
'"Expression-TypeID":"46c86c68-42ae-4089-8398-6e4140fe8658"},'
'{"ID":"12386c3e-cb7a-ed11-a3bb-00bd3aaa0111","DisplayString":"User2, Blah",'
'"Expression-TypeCase":"46c86c68-42ae-4089-8398-6e4140fe8658",'
'"Expression-TypeID":"46c86c68-42ae-4089-8398-6e4140fe8658"}],'
'"Schema":[{"ColumnName":"ID","ColumnType":"GuidType","Localizable":false},'
'{"ColumnName":"DisplayString","ColumnType":"StringType","Localizable":false}]}'
)
USER = '{"ID":"a9386c3e-cb7a-ed11-a3bb-000d3aaa0172","DisplayString":"User1, Leo"}'
OBJECT = '{"ID":"424242","SPSActivityClassBase":{"TicketNumber":"TCK0000153","TimeStamp":"AAAAAAHlWr4="}}'
@pytest.fixture
def matrix42():
return Matrix42.objects.create(slug='test', base_url='https://matrix42.example.net/api/', token='token1')
@mock.patch('passerelle.utils.Request.request')
def test_matrix42_fragment(mocked_request, app, matrix42):
endpoint = generic_endpoint_url('matrix42', 'fragment', slug=matrix42.slug)
assert endpoint == '/matrix42/test/fragment'
endpoint += '/SPSUserClassBase'
params = {}
mocked_request.side_effect = [
FakedResponse(content=TOKEN, status_code=200),
FakedResponse(content=USERS, status_code=200),
]
resp = app.get(endpoint, params=params, status=403)
assert mocked_request.call_count == 0
assert resp.json['err'] == 1
assert resp.json['err_class'] == 'django.core.exceptions.PermissionDenied'
# open access
api = ApiUser.objects.create(username='all', keytype='', key='')
obj_type = ContentType.objects.get_for_model(matrix42)
AccessRight.objects.create(
codename='can_access', apiuser=api, resource_type=obj_type, resource_pk=matrix42.pk
)
# get all users
resp = app.get(endpoint, params=params, status=200)
assert mocked_request.call_count == 2
get_token, get_users = mocked_request.call_args_list
assert get_token[0] == (
'POST',
'https://matrix42.example.net/api/ApiToken/GenerateAccessTokenFromApiToken',
)
assert get_token[1]['json'] == get_token[1]['params'] == None
assert get_token[1]['headers']['Authorization'] == 'Bearer token1'
assert get_users[0] == (
'GET',
'https://matrix42.example.net/api/data/fragments/SPSUserClassBase/schema-info',
)
assert get_users[1]['json'] is None
assert get_users[1]['params'] == {}
assert get_users[1]['headers']['Authorization'] == 'Bearer token2'
assert resp.json['err'] == 0
assert len(resp.json['data']) == 2
assert resp.json['data'][0]['id'] == resp.json['data'][0]['ID'] == 'a9386c3e-cb7a-ed11-a3bb-000d3aaa0172'
assert resp.json['data'][0]['text'] == resp.json['data'][0]['DisplayString'] == 'User1, Leo'
# get all users, with parameters
params['id_template'] = 'id:{{ID}}'
params['template'] = 'ds:{{DisplayString}}'
mocked_request.side_effect = [
FakedResponse(content=TOKEN, status_code=200),
FakedResponse(content=USERS, status_code=200),
]
resp = app.get(endpoint, params=params, status=200)
assert resp.json['err'] == 0
assert len(resp.json['data']) == 2
assert resp.json['data'][0]['id'] == 'id:a9386c3e-cb7a-ed11-a3bb-000d3aaa0172'
assert resp.json['data'][0]['text'] == 'ds:User1, Leo'
# search user
params['q'] = 'User'
resp = app.get(endpoint, params=params, status=400)
assert resp.json['err'] == 1
assert resp.json['err_desc'] == 'q needs a search_column parameter'
params['search_column'] = 'DisplayString'
params['columns'] = 'DisplayString'
mocked_request.reset_mock()
mocked_request.side_effect = [
FakedResponse(content=TOKEN, status_code=200),
FakedResponse(content=USERS, status_code=200),
]
resp = app.get(endpoint, params=params, status=200)
_, get_users = mocked_request.call_args_list
assert get_users[0] == (
'GET',
'https://matrix42.example.net/api/data/fragments/SPSUserClassBase/schema-info',
)
assert get_users[1]['params'] == {'columns': 'DisplayString', 'where': "DisplayString LIKE '%User%'"}
assert resp.json['err'] == 0
assert len(resp.json['data']) == 2
assert resp.json['data'][0]['id'] == 'id:a9386c3e-cb7a-ed11-a3bb-000d3aaa0172'
assert resp.json['data'][0]['text'] == 'ds:User1, Leo'
# filter
del params['q']
params['filter'] = '1=1'
mocked_request.reset_mock()
mocked_request.side_effect = [
FakedResponse(content=TOKEN, status_code=200),
FakedResponse(content=USERS, status_code=200),
]
resp = app.get(endpoint, params=params, status=200)
_, get_users = mocked_request.call_args_list
assert get_users[0] == (
'GET',
'https://matrix42.example.net/api/data/fragments/SPSUserClassBase/schema-info',
)
assert get_users[1]['params'] == {
'columns': 'DisplayString',
'where': '1=1',
}
assert resp.json['err'] == 0
assert len(resp.json['data']) == 2
# filter & q
params['q'] = 'User'
params['filter'] = '1=1'
mocked_request.reset_mock()
mocked_request.side_effect = [
FakedResponse(content=TOKEN, status_code=200),
FakedResponse(content=USERS, status_code=200),
]
resp = app.get(endpoint, params=params, status=200)
_, get_users = mocked_request.call_args_list
assert get_users[0] == (
'GET',
'https://matrix42.example.net/api/data/fragments/SPSUserClassBase/schema-info',
)
assert get_users[1]['params'] == {
'columns': 'DisplayString',
'where': "DisplayString LIKE '%User%' AND 1=1",
}
assert resp.json['err'] == 0
assert len(resp.json['data']) == 2
# get one user
del params['q']
params['id'] = 'a9386c3e-cb7a-ed11-a3bb-000d3aaa0172'
mocked_request.reset_mock()
mocked_request.side_effect = [
FakedResponse(content=TOKEN, status_code=200),
FakedResponse(content=USER, status_code=200),
]
resp = app.get(endpoint, params=params, status=200)
_, get_users = mocked_request.call_args_list
assert get_users[0] == (
'GET',
'https://matrix42.example.net/api/data/fragments/SPSUserClassBase/a9386c3e-cb7a-ed11-a3bb-000d3aaa0172',
)
assert resp.json['err'] == 0
assert len(resp.json['data']) == 1
assert resp.json['data'][0]['id'] == 'id:a9386c3e-cb7a-ed11-a3bb-000d3aaa0172'
assert resp.json['data'][0]['text'] == 'ds:User1, Leo'
@mock.patch('passerelle.utils.Request.request')
def test_matrix42_bad_rawtoken(mocked_request, app, matrix42):
endpoint = generic_endpoint_url('matrix42', 'fragment', slug=matrix42.slug)
endpoint += '/SPSUserClassBase'
params = {}
# open access
api = ApiUser.objects.create(username='all', keytype='', key='')
obj_type = ContentType.objects.get_for_model(matrix42)
AccessRight.objects.create(
codename='can_access', apiuser=api, resource_type=obj_type, resource_pk=matrix42.pk
)
# no RawToken
mocked_request.side_effect = [
FakedResponse(content='{}', status_code=200),
]
resp = app.get(endpoint, params=params, status=200)
assert resp.json['err'] == 1
assert resp.json['err_class'] == 'passerelle.utils.jsonresponse.APIError'
assert resp.json['err_desc'] == 'Matrix42 not returned a RawToken: {}'
# bad JSON
mocked_request.side_effect = [
FakedResponse(content='crashme', status_code=200),
]
resp = app.get(endpoint, params=params, status=200)
assert resp.json['err'] == 1
assert resp.json['err_class'] == 'passerelle.utils.jsonresponse.APIError'
assert 'invalid JSON' in resp.json['err_desc']
# not a dict
mocked_request.side_effect = [
FakedResponse(content='"crashme"', status_code=200),
]
resp = app.get(endpoint, params=params, status=200)
assert resp.json['err'] == 1
assert resp.json['err_class'] == 'passerelle.utils.jsonresponse.APIError'
assert 'not returned a dict' in resp.json['err_desc']
# Matrix42 error
mocked_request.side_effect = [
FakedResponse(content='{"ExceptionName":"NotFound","Message":"4o4"}', status_code=404),
]
resp = app.get(endpoint, params=params, status=200)
assert resp.json['err'] == 1
assert resp.json['err_class'] == 'passerelle.utils.jsonresponse.APIError'
assert resp.json['err_desc'] == 'Matrix42 returned 404 response, ExceptionName "NotFound": 4o4'
mocked_request.side_effect = [
FakedResponse(content=TOKEN, status_code=500),
]
resp = app.get(endpoint, params=params, status=200)
assert resp.json['err'] == 1
assert resp.json['err_class'] == 'passerelle.utils.jsonresponse.APIError'
assert resp.json['err_desc'] == 'Matrix42 returned status code 500'
@mock.patch('passerelle.utils.Request.request')
def test_matrix42_object(mocked_request, app, matrix42):
api = ApiUser.objects.create(username='all', keytype='', key='')
obj_type = ContentType.objects.get_for_model(matrix42)
AccessRight.objects.create(
codename='can_access', apiuser=api, resource_type=obj_type, resource_pk=matrix42.pk
)
# create-object
mocked_request.side_effect = [
FakedResponse(content=TOKEN, status_code=200),
FakedResponse(content='"424242"', status_code=200),
]
endpoint = generic_endpoint_url('matrix42', 'create-object', slug=matrix42.slug)
endpoint += '/SPSActivityTypeTicket'
payload = {
'SPSActivityClassBase/Subject': 'incident subject',
'SPSActivityClassBase/Category': 'category-id',
}
resp = app.post_json(endpoint, params=payload, status=200)
assert mocked_request.call_count == 2
get_token, post_object = mocked_request.call_args_list
assert get_token[0] == (
'POST',
'https://matrix42.example.net/api/ApiToken/GenerateAccessTokenFromApiToken',
)
assert get_token[1]['json'] == get_token[1]['params'] == None
assert get_token[1]['headers']['Authorization'] == 'Bearer token1'
assert post_object[0] == (
'POST',
'https://matrix42.example.net/api/data/objects/SPSActivityTypeTicket',
)
assert post_object[1]['json'] == {
'SPSActivityClassBase': {
'Subject': 'incident subject',
'Category': 'category-id',
}
}
assert post_object[1]['params'] is None
assert post_object[1]['headers']['Authorization'] == 'Bearer token2'
assert resp.json['err'] == 0
assert resp.json['data'] == '424242'
# get-object
mocked_request.reset_mock()
mocked_request.side_effect = [
FakedResponse(content=TOKEN, status_code=200),
FakedResponse(content=OBJECT, status_code=200),
]
endpoint = generic_endpoint_url('matrix42', 'get-object', slug=matrix42.slug)
endpoint += '/SPSActivityTypeTicket/424242' # ciName + id
resp = app.get(endpoint, status=200)
assert mocked_request.call_count == 2
get_token, get_object = mocked_request.call_args_list
assert get_token[0] == (
'POST',
'https://matrix42.example.net/api/ApiToken/GenerateAccessTokenFromApiToken',
)
assert get_token[1]['json'] == get_token[1]['params'] == None
assert get_token[1]['headers']['Authorization'] == 'Bearer token1'
assert get_object[0] == (
'GET',
'https://matrix42.example.net/api/data/objects/SPSActivityTypeTicket/424242',
)
assert get_object[1]['json'] == get_object[1]['params'] == None
assert get_object[1]['headers']['Authorization'] == 'Bearer token2'
assert resp.json['err'] == 0
assert resp.json['data'] == {
'ID': '424242',
'SPSActivityClassBase': {'TicketNumber': 'TCK0000153', 'TimeStamp': 'AAAAAAHlWr4='},
}
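
The create-object assertion above shows slash-separated parameter names being folded into a nested JSON body. A small sketch of that folding; nest_payload is a hypothetical name, not the connector's code:

```python
def nest_payload(flat):
    """Fold 'A/B' style keys into nested dicts, as the asserted request body shows."""
    nested = {}
    for key, value in flat.items():
        target = nested
        *parents, leaf = key.split('/')
        for part in parents:
            target = target.setdefault(part, {})
        target[leaf] = value
    return nested

payload = {
    'SPSActivityClassBase/Subject': 'incident subject',
    'SPSActivityClassBase/Category': 'category-id',
}
assert nest_payload(payload) == {
    'SPSActivityClassBase': {'Subject': 'incident subject', 'Category': 'category-id'}
}
```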

View File

@ -302,6 +302,10 @@ def test_create_aec_demand_type(app, setup, aec_payload):
assert root.find('DemandeActe/Titulaire/Filiation/Pere/Nom').text == 'Yamamoto'
assert root.find('DemandeActe/Titulaire/Filiation/Pere/Prenoms').text == 'Ryu'
assert os.path.exists(basedir)
setup.daily()
assert not os.path.exists(basedir)
def test_create_aec_demand_type_with_user_comment(app, setup, aec_payload):
AEC_PAYLOAD = dict(aec_payload)

View File

@ -16,7 +16,8 @@ FAKE_FEATURE_INFO = '''<?xml version="1.0" encoding="UTF-8"?>
<msGMLOutput
xmlns:gml="http://www.opengis.net/gml"
xmlns:xlink="http://www.w3.org/1999/xlink"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xmlns:extra="http://example.net/">
<cad_cadastre.cadparcelle_layer>
<gml:name>Parcelle cadastrale (Plan cadastral informatise du Grand Lyon)</gml:name>
<cad_cadastre.cadparcelle_feature>
@ -33,6 +34,7 @@ FAKE_FEATURE_INFO = '''<?xml version="1.0" encoding="UTF-8"?>
<indice>Parcelle figuree au plan</indice>
<arpentage>Arpentee</arpentage>
<gid>75404</gid>
<extra:foo>bar</extra:foo>
</cad_cadastre.cadparcelle_feature>
</cad_cadastre.cadparcelle_layer>
</msGMLOutput>'''
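
The extra: namespace added to this fixture is what the new assertions later in the test rely on: elements from any namespace end up in the result under their local name, while the gml: envelope elements are dropped. A sketch of that flattening, assuming lxml and not taken from the connector:

```python
from lxml import etree

def feature_to_dict(feature):
    """Keep child elements from any namespace under their local name, skip gml:* metadata."""
    data = {}
    for child in feature:
        if child.prefix == 'gml':
            continue  # drop gml:name / gml:boundedBy style envelope elements
        data[etree.QName(child).localname] = child.text
    return data

doc = etree.fromstring(
    b'<f xmlns:gml="http://www.opengis.net/gml" xmlns:extra="http://example.net/">'
    b'<gml:boundedBy>...</gml:boundedBy><gid>75404</gid><extra:foo>bar</extra:foo></f>'
)
assert feature_to_dict(doc) == {'gid': '75404', 'foo': 'bar'}
```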
@ -433,6 +435,12 @@ def test_feature_info(mocked_get, app, connector):
]
== 'Particulier'
)
assert 'name' not in resp.json['data']['cad_cadastrecadparcelle_layer']
assert (
'boundedBy'
not in resp.json['data']['cad_cadastrecadparcelle_layer']['cad_cadastrecadparcelle_feature']
)
assert 'foo' in resp.json['data']['cad_cadastrecadparcelle_layer']['cad_cadastrecadparcelle_feature']
connector.projection = 'EPSG:4326'
connector.save()
resp = app.get(endpoint, params={'lat': '45.796890', 'lon': '4.784140'})

View File

@ -1,3 +1,19 @@
# passerelle - uniform access to multiple data sources and services
# Copyright (C) 2023 Entr'ouvert
#
# This program is free software: you can redistribute it and/or modify it
# under the terms of the GNU Affero General Public License as published
# by the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU Affero General Public License for more details.
#
# You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <http://www.gnu.org/licenses/>.
import datetime
import itertools
import logging
@ -5,6 +21,7 @@ import logging
import pytest
import requests
from django.core.exceptions import ValidationError
from django.db import transaction
from django.utils.log import AdminEmailHandler
from httmock import HTTMock
@ -13,6 +30,7 @@ from passerelle.apps.feeds.models import Feed
from passerelle.base.models import ProxyLogger, ResourceLog
from passerelle.contrib.stub_invoices.models import StubInvoicesConnector
from passerelle.utils.api import endpoint
from passerelle.utils.defer import run_later_scope
from passerelle.utils.jsonresponse import APIError
from tests.test_availability import down_mock, up_mock
@ -388,3 +406,33 @@ def test_proxy_logger_bytes(db, connector):
base_logger.debug('test', extra={'payload': b'\xff\xff'})
log = ResourceLog.objects.latest('id')
assert log.extra == {'payload': '\\xff\\xff'}
def test_log_in_transaction(transactional_db, connector):
qs = ResourceLog.objects.all()
assert not qs.exists()
class MyError(Exception):
pass
# without run_later_scope, logs inside transactions are lost
try:
with transaction.atomic():
connector.logger.info('info')
connector.logger.warning('warning')
raise MyError
except MyError:
pass
assert qs.count() == 0
# with run_later_scope, logs emitted inside the transaction are kept, because
# they are inserted in the db after the rollback
try:
with run_later_scope():
with transaction.atomic():
connector.logger.info('info')
connector.logger.warning('warning')
raise MyError
except MyError:
pass
assert qs.count() == 2
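
run_later_scope comes from passerelle.utils.defer and its implementation is not shown in this diff; the test only relies on work registered inside the scope running when the scope exits, i.e. after the enclosing transaction.atomic() block has rolled back. A hypothetical sketch of that pattern:

```python
import contextlib
import threading

_pending = threading.local()

@contextlib.contextmanager
def run_later_scope():
    _pending.jobs = []
    try:
        yield
    finally:
        jobs, _pending.jobs = _pending.jobs, None
        for job in jobs:
            job()  # e.g. write the queued log rows, outside any transaction

def run_later(func, *args, **kwargs):
    jobs = getattr(_pending, 'jobs', None)
    if jobs is None:
        func(*args, **kwargs)  # no scope: run immediately, subject to the surrounding transaction
    else:
        jobs.append(lambda: func(*args, **kwargs))
```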

252
tests/test_qrcode.py Normal file
View File

@ -0,0 +1,252 @@
# passerelle - uniform access to multiple data sources and services
# Copyright (C) 2022 Entr'ouvert
#
# This program is free software: you can redistribute it and/or modify it
# under the terms of the GNU Affero General Public License as published
# by the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU Affero General Public License for more details.
#
# You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <http://www.gnu.org/licenses/>.
import datetime
import uuid
from datetime import timezone
import pytest
from passerelle.apps.qrcode.models import Certificate, QRCodeConnector, Reader
from tests.utils import generic_endpoint_url, setup_access_rights
@pytest.fixture()
def connector(db):
return setup_access_rights(
QRCodeConnector.objects.create(
slug='test',
key='5e8176e50d45b67e9db875d6006edf3ba805ff4ef4d945327012db4c797be1be',
)
)
def test_save_certificate(app, connector):
endpoint = generic_endpoint_url('qrcode', 'save-certificate', slug=connector.slug)
result = app.post_json(
endpoint,
params={
'data': {
'first_name': 'Georges',
'last_name': 'Abitbol',
},
'validity_start': '2022-01-01 10:00:00+00:00',
'validity_end': '2023-01-01 10:00:00+00:00',
},
)
assert result.json['err'] == 0
certificate_uuid = result.json['data']['uuid']
assert result.json['data']['qrcode_url'] == f'http://testserver/qrcode/test/get-qrcode/{certificate_uuid}'
certificate = connector.certificates.get(uuid=certificate_uuid)
assert certificate.data['first_name'] == 'Georges'
assert certificate.data['last_name'] == 'Abitbol'
assert certificate.validity_start == datetime.datetime(2022, 1, 1, 10, 0, 0, 0, tzinfo=timezone.utc)
assert certificate.validity_end == datetime.datetime(2023, 1, 1, 10, 0, 0, 0, tzinfo=timezone.utc)
result = app.post_json(
f'{endpoint}/{certificate_uuid}',
params={
'data': {
'first_name': 'Robert',
'last_name': 'Redford',
},
'validity_start': '2024-01-01T10:00:00+00:00',
'validity_end': '2025-01-01T10:00:00+00:00',
},
)
certificate.refresh_from_db()
assert certificate.data['first_name'] == 'Robert'
assert certificate.data['last_name'] == 'Redford'
assert certificate.validity_start == datetime.datetime(2024, 1, 1, 10, 0, 0, 0, tzinfo=timezone.utc)
assert certificate.validity_end == datetime.datetime(2025, 1, 1, 10, 0, 0, 0, tzinfo=timezone.utc)
def test_get_certificate(app, connector):
certificate = connector.certificates.create(
data={
'first_name': 'Georges',
'last_name': 'Abitbol',
},
validity_start=datetime.datetime(2022, 1, 1, 10, 0, 0, 0, tzinfo=timezone.utc),
validity_end=datetime.datetime(2023, 1, 1, 10, 0, 0, 0, tzinfo=timezone.utc),
)
endpoint = generic_endpoint_url('qrcode', 'get-certificate', slug=connector.slug)
result = app.get(f'{endpoint}/{certificate.uuid}')
assert result.json == {
'err': 0,
'data': {
'uuid': str(certificate.uuid),
'data': {'first_name': 'Georges', 'last_name': 'Abitbol'},
'validity_start': '2022-01-01T10:00:00+00:00',
'validity_end': '2023-01-01T10:00:00+00:00',
'qrcode_url': f'http://testserver/qrcode/test/get-qrcode/{certificate.uuid}',
},
}
def test_get_qrcode(app, connector):
certificate = connector.certificates.create(
uuid=uuid.UUID('12345678-1234-5678-1234-567812345678'),
data={
'first_name': 'Georges',
'last_name': 'Abitbol',
},
validity_start=datetime.datetime(2022, 1, 1, 10, 0, 0, 0, tzinfo=timezone.utc),
validity_end=datetime.datetime(2023, 1, 1, 10, 0, 0, 0, tzinfo=timezone.utc),
)
endpoint = generic_endpoint_url('qrcode', 'get-qrcode', slug=connector.slug)
response = app.get(f'{endpoint}/{certificate.uuid}')
assert response.headers['Content-Type'] == 'image/png'
with open('tests/data/qrcode/test-qrcode.png', 'rb') as expected_qrcode:
# just check images are the same. Decoded content is tested javascript-side.
assert response.body == expected_qrcode.read()
def test_save_reader(app, connector):
endpoint = generic_endpoint_url('qrcode', 'save-reader', slug=connector.slug)
result = app.post_json(
endpoint,
params={
'validity_start': '2022-01-01 10:00:00+00:00',
'validity_end': '2023-01-01 10:00:00+00:00',
},
)
assert result.json['err'] == 0
reader_uuid = result.json['data']['uuid']
assert result.json['data']['url'] == f'http://testserver/qrcode/test/open-reader/{reader_uuid}'
reader = connector.readers.get(uuid=reader_uuid)
assert reader.validity_start == datetime.datetime(2022, 1, 1, 10, 0, 0, 0, tzinfo=timezone.utc)
assert reader.validity_end == datetime.datetime(2023, 1, 1, 10, 0, 0, 0, tzinfo=timezone.utc)
result = app.post_json(
f'{endpoint}/{reader_uuid}',
params={
'validity_start': '2024-01-01T10:00:00+00:00',
'validity_end': '2025-01-01T10:00:00+00:00',
},
)
reader.refresh_from_db()
assert reader.validity_start == datetime.datetime(2024, 1, 1, 10, 0, 0, 0, tzinfo=timezone.utc)
assert reader.validity_end == datetime.datetime(2025, 1, 1, 10, 0, 0, 0, tzinfo=timezone.utc)
def test_get_reader(app, connector):
reader = connector.readers.create(
validity_start=datetime.datetime(2022, 1, 1, 10, 0, 0, 0, tzinfo=timezone.utc),
validity_end=datetime.datetime(2023, 1, 1, 10, 0, 0, 0, tzinfo=timezone.utc),
)
endpoint = generic_endpoint_url('qrcode', 'get-reader', slug=connector.slug)
result = app.get(f'{endpoint}/{reader.uuid}')
assert result.json == {
'err': 0,
'data': {
'uuid': str(reader.uuid),
'validity_start': '2022-01-01T10:00:00+00:00',
'validity_end': '2023-01-01T10:00:00+00:00',
'url': f'http://testserver/qrcode/test/open-reader/{reader.uuid}',
},
}
def test_open_reader(app, connector, freezer):
reader = connector.readers.create(
validity_start=datetime.datetime(2022, 1, 1, 10, 0, 0, 0, tzinfo=timezone.utc),
validity_end=datetime.datetime(2023, 1, 1, 10, 0, 0, 0, tzinfo=timezone.utc),
)
endpoint = generic_endpoint_url('qrcode', 'open-reader', slug=connector.slug)
freezer.move_to('2022-01-01T09:59:59')
result = app.get(f'{endpoint}/{reader.uuid}')
assert 'Reader isn\'t usable yet' in result.body.decode('utf-8')
freezer.move_to('2022-01-01T10:00:00')
result = app.get(f'{endpoint}/{reader.uuid}')
assert result.pyquery(f'qrcode-reader[verify-key="{connector.hex_verify_key}"]')
freezer.move_to('2023-01-01T10:00:01')
result = app.get(f'{endpoint}/{reader.uuid}')
assert 'Reader has expired.' in result.body.decode('utf-8')
MISSING = object()
@pytest.mark.parametrize('value', [MISSING, None, ''], ids=['missing', 'null', 'empty string'])
class TestOptional:
def test_certificate_validity_start(self, value, app, connector):
params = {
'data': {
'first_name': 'Georges',
'last_name': 'Abitbol',
},
'validity_end': '2023-01-01 10:00:00+00:00',
}
if value is not MISSING:
params['validity_start'] = value
app.post_json('/qrcode/test/save-certificate/', params=params)
assert Certificate.objects.get().validity_start is None
def test_certificate_validity_end(self, value, app, connector):
params = {
'data': {
'first_name': 'Georges',
'last_name': 'Abitbol',
},
'validity_start': '2023-01-01 10:00:00+00:00',
}
if value is not MISSING:
params['validity_end'] = value
app.post_json('/qrcode/test/save-certificate/', params=params)
assert Certificate.objects.get().validity_end is None
def test_reader_validity_start(self, value, app, connector):
params = {
'validity_end': '2023-01-01 10:00:00+00:00',
}
if value is not MISSING:
params['validity_start'] = value
app.post_json('/qrcode/test/save-reader/', params=params)
assert Reader.objects.get().validity_start is None
def test_reader_validity_end(self, value, app, connector):
params = {
'validity_start': '2023-01-01 10:00:00+00:00',
}
if value is not MISSING:
params['validity_end'] = value
app.post_json('/qrcode/test/save-reader/', params=params)
assert Reader.objects.get().validity_end is None
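The MISSING sentinel used in TestOptional is a standard way to tell "parameter not sent at all" apart from "parameter sent as null or as an empty string", since None and '' are themselves valid JSON values. A minimal, self-contained sketch of the pattern (build_params is purely illustrative, not part of the connector API):
MISSING = object()  # unique sentinel, equal only to itself

def build_params(validity_start=MISSING):
    params = {}
    if validity_start is not MISSING:
        # the caller explicitly sent the field, even if it is None or ''
        params['validity_start'] = validity_start
    return params

assert 'validity_start' not in build_params()
assert build_params(None)['validity_start'] is None
assert build_params('')['validity_start'] == ''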

View File

@ -325,6 +325,14 @@ def test_resource_certificates(mocked_get, caplog, endpoint_response):
assert mocked_get.call_args[1].get('verify') is True
assert 'cert' not in mocked_get.call_args[1]
with override_settings(REQUESTS_IGNORE_HTTPS_CERTIFICATE_ERRORS=['example.com']):
request.get('http://example.net/whatever')
assert mocked_get.call_args[1].get('verify') is True
with override_settings(REQUESTS_IGNORE_HTTPS_CERTIFICATE_ERRORS=['example.net']):
request.get('http://example.net/whatever')
assert mocked_get.call_args[1].get('verify') is False
resource.verify_cert = False
request.get('http://example.net/whatever')
assert mocked_get.call_args[1].get('verify') is False

View File

@ -90,8 +90,43 @@ def test_get_demande_logement_does_not_exist(app, connector, settings):
)
resp = app.get('/sne/test/get-demande-logement?demand_id=0690221008931G3164')
json_resp = resp.json
assert json_resp['err'] == 0
assert json_resp['err_desc'] == "La demande de logement n'existe pas dans le système."
def test_get_demande_logement_bad_guichet(app, connector, settings):
with responses.RequestsMock() as rsps:
setup_(rsps, settings)
with open('%s/tests/data/sne/response_mauvais_guichet' % os.getcwd(), 'rb') as f:
rsps.post(
'https://sne-ws-2.site-ecole.din.developpement-durable.gouv.invalid/services/DemandeLogementImplService',
status=200,
body=f.read(),
content_type='multipart/related; start="<rootpart*7902e9bd-21a8-4632-8760-d79a67eb89a1@example.jaxws.sun.com>"; type="application/xop+xml";'
' boundary="uuid:7902e9bd-21a8-4632-8760-d79a67eb89a1"; start-info="application/soap+xml"',
)
resp = app.get('/sne/test/get-demande-logement?demand_id=0690221008931G3164')
json_resp = resp.json
assert json_resp['err'] == 0
assert (
json_resp['err_desc']
== 'Votre guichet enregistreur ne couvre pas au moins une des communes souhaitées de la demande de logement.'
)
def test_get_demande_logement_missing(app, connector, settings):
with responses.RequestsMock() as rsps:
resp = app.get('/sne/test/get-demande-logement?demand_id=')
json_resp = resp.json
assert json_resp['err'] == 0
assert json_resp['err_desc'] == 'demand_id must contains 18 characters'
assert len(rsps.calls) == 0
def test_get_demande_logement_bad_length(app, connector, settings):
with responses.RequestsMock() as rsps:
resp = app.get('/sne/test/get-demande-logement?demand_id=1234')
json_resp = resp.json
assert json_resp['err'] == 0
assert json_resp['err_desc'] == 'demand_id must contains 18 characters'
assert len(rsps.calls) == 0

View File

@ -17,9 +17,11 @@ import base64
import urllib.parse
import pytest
import responses
from webtest import Upload
from passerelle.apps.soap.models import SOAPConnector
from tests.test_manager import login
from . import utils
@ -109,8 +111,10 @@ xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">
</soap:Body>
</soap:Envelope>'''
INPUT_SCHEMA = {
'$anchor': 'ref-6adf97f83acf6453d4a6a4b1070f3754',
'properties': {
'firstName': {
'$anchor': 'ref-dbd3a37522045c54032a5b96864a500d',
'description': '{http://www.examples.com/wsdl/HelloService.wsdl}firstName',
'properties': {
'string': {'items': {'type': 'string', 'description': 'xsd:string'}, 'type': 'array'},
@ -124,6 +128,7 @@ xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">
'type': 'object',
}
OUTPUT_SCHEMA = {
'$anchor': 'ref-6adf97f83acf6453d4a6a4b1070f3754',
'properties': {
'greeting': {'type': 'string', 'description': 'xsd:string'},
'who': {'type': 'string', 'description': 'xsd:string'},
@ -157,6 +162,11 @@ class SOAP12(SOAP11):
<xsd:schema xmlns:xsd="http://www.w3.org/2001/XMLSchema"
xmlns:tns="urn:examples:helloservice"
targetNamespace="urn:examples:helloservice">
<xsd:complexType name="recurse">
<xsd:sequence>
<xsd:element name="anotherme" type="tns:recurse" minOccurs="0"/>
</xsd:sequence>
</xsd:complexType>
<xsd:element name="sayHello">
<xsd:complexType>
<xsd:sequence>
@ -170,6 +180,7 @@ class SOAP12(SOAP11):
<xsd:sequence>
<xsd:element name="greeting" type="xsd:string"/>
<xsd:element name="who" type="xsd:string" maxOccurs="unbounded"/>
<xsd:element name="recursion" type="tns:recurse" minOccurs="0"/>
</xsd:sequence>
</xsd:complexType>
</xsd:element>
@ -223,10 +234,12 @@ class SOAP12(SOAP11):
<sayHelloResponse xmlns="urn:examples:helloservice">
<greeting>Hello</greeting>
<who>John!</who>
<recursion><anotherme/></recursion>
</sayHelloResponse>
</soap:Body>
</soap:Envelope>'''
INPUT_SCHEMA = {
'$anchor': 'ref-5b712371a9c9bf61f983831c2ed3f364',
'type': 'object',
'properties': {
'firstName': {'type': 'array', 'items': {'type': 'string', 'description': 'xsd:string'}},
@ -236,6 +249,7 @@ class SOAP12(SOAP11):
'description': '{urn:examples:helloservice}sayHello',
}
OUTPUT_SCHEMA = {
'$anchor': 'ref-5e505e086d14d5417f2799da5c085712',
'description': '{urn:examples:helloservice}sayHelloResponse',
'properties': {
'greeting': {'type': 'string', 'description': 'xsd:string'},
@ -243,6 +257,16 @@ class SOAP12(SOAP11):
'type': 'array',
'items': {'type': 'string', 'description': 'xsd:string'},
},
'recursion': {
'$anchor': 'ref-63d3d62358d2daf62cd2ebd07640165e',
'description': '{urn:examples:helloservice}recurse',
'type': 'object',
'properties': {
'anotherme': {
'$ref': '#ref-63d3d62358d2daf62cd2ebd07640165e',
}
},
},
},
'required': ['greeting', 'who'],
'type': 'object',
@ -255,6 +279,7 @@ class SOAP12(SOAP11):
OUTPUT_DATA = {
'greeting': 'Hello',
'who': ['John!'],
'recursion': {'anotherme': None},
}
VALIDATION_ERROR = 'Expected at least 1 items (minOccurs check) 0 items found. (sayHello.firstName)'
@ -275,8 +300,6 @@ def soap(request):
class TestManage:
@pytest.fixture
def app(self, app, admin_user):
from .test_manager import login
login(app)
return app
@ -405,3 +428,32 @@ class TestAuthencation:
app.post_json('/soap/test/method/sayHello/', params=soap.INPUT_DATA)
assert 'Authorization' not in soap.endpoint_mock.handlers[0].call['requests'][1].headers
assert b'wsse:UsernameToken' in soap.endpoint_mock.handlers[0].call['requests'][1].body
def test_status_down_then_up(db, app, admin_user, monkeypatch):
class MockCache:
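# stub cache: add() stores nothing and get() always misses, so the broken then fixed WSDL is fetched anew on each request below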
def add(self, *args, **kwargs):
pass
def get(self, *args, **kwargs):
pass
import passerelle.utils.soap
monkeypatch.setattr(passerelle.utils.soap, 'InMemoryCache', MockCache)
app = login(app)
broken_wsdl_content = SOAP11.WSDL_CONTENT[:100]
conn = SOAPConnector.objects.create(
slug='test', wsdl_url=SOAP11.WSDL_URL, zeep_strict=True, zeep_xsd_ignore_sequence_order=False
)
with responses.RequestsMock() as rsps:
rsps.get(SOAP11.WSDL_URL, status=200, body=broken_wsdl_content)
app.get('/soap/test/')
assert conn.get_availability_status().status == 'down'
with responses.RequestsMock() as rsps:
rsps.get(SOAP11.WSDL_URL, status=200, body=SOAP11.WSDL_CONTENT)
app.get('/soap/test/')
assert conn.get_availability_status().status == 'up'

View File

@ -15,6 +15,7 @@
# along with this program. If not, see <http://www.gnu.org/licenses/>.
import inspect
import re
from django.apps import apps
from django.utils import translation
@ -99,3 +100,29 @@ def test_render_oneof_property_required():
],
}
assert "<b>oneOf</b> [ <em>required 'a'</em> | <em>required 'b'</em> ]" in render_json_schema(schema)
def test_render_json_schema_anchor():
SCHEMA = {
'$schema': 'http://json-schema.org/draft-04/schema#',
'type': 'object',
'required': ['foo'],
'additionalProperties': False,
'properties': {
'foo': {
'type': 'object',
'$anchor': 'foo',
'title': 'foo object',
'properties': {'a': {'type': 'string'}},
},
'zorglub': {
'$ref': '#foo',
},
},
}
# Check that no unicode crash occurs
with translation.override('fr'):
fragment = render_json_schema(SCHEMA)
match = re.search(r'id="(schema-object-foo-[^"]+)"', fragment)
assert match
assert f'href="#{match.group(1)}">foo object</a>' in fragment

View File

@ -15,9 +15,11 @@
# along with this program. If not, see <http://www.gnu.org/licenses/>.
import base64
import cgi
import json
import os
import urllib.parse
from io import BytesIO
import httmock
import pytest
@ -35,11 +37,6 @@ def get_json_content(name):
HTTP_MOCKS = {
'type-emploi': {
'path': r'^/.*/data/type_emploi$',
'query': 'viewIntegrationName=api_publik&count=200',
'content': get_json_content('type_emploi'),
},
'origine-candidature': {
'path': r'^/.*/data/origine_candidature$',
'query': 'viewIntegrationName=api_publik&count=200',
@ -87,7 +84,7 @@ HTTP_MOCKS = {
},
'annonce': {
'path': r'^.*/data/annonce$',
'query': 'viewIntegrationName=api_publik',
'query': 'viewIntegrationName=api_publik&count=200',
'content': get_json_content('annonce'),
},
'pdf': {
@ -248,7 +245,6 @@ class TestEndpoints:
'civilite',
'nationalite',
'situation-actuelle',
'type-emploi',
'domaine-emploi',
'sous-domaine-emploi',
'emploi',
@ -279,7 +275,7 @@ class TestEndpoints:
response = app.get('/toulouse-foederis/foederis/announce/')
content = response.json
assert content['err'] == 0
assert len(content['data_sources']) == 9
data = content['data']
assert len(data) == 5
@ -440,7 +436,7 @@ class TestEndpoints:
'complement_information_candidature': 'I need money.',
'R1261279': 1561049,
'accord_RGPD': True,
'type_emploi_souhaite': 'Emploi saisonnier',
'R60845221': [5776395, 5776396],
'R60845244': [5776394, 5776395],
'R15017962': [],
@ -481,7 +477,7 @@ class TestEndpoints:
'internship_duration': '2h',
'job_families': ['5776394', '5776395'],
'job_realms': ['5776395', '5776396'],
'job_types': 'Emploi saisonnier',
'last_course_taken': 'Terminale',
'last_name': 'Doe',
'last_obtained_diploma': 'BAC',
@ -528,7 +524,7 @@ class TestEndpoints:
)
expected_payload.update(
{'annonce': int(announce_id), 'candidature_spontane': 'N', 'R14846954': 'ID_OFFRE'}
)
external_id = f'announce-{announce_id}'
@ -558,6 +554,7 @@ class TestEndpoints:
)
assert response.json['data']['application_id'] == 42
def test_create_application_phone_error(self, resource, app):
response = app.post_json(
'/toulouse-foederis/foederis/create-application',
params={'phone': 'mille sabords'},
@ -565,16 +562,36 @@ class TestEndpoints:
assert response.json['err'] == 1
assert 'Couldn\'t recognize provided phone number' in response.json['err_desc']
def test_create_application_empty_phone(self, resource, app):
@httmock.urlmatch(path=r'^.*/data/candidature$')
def handler(url, request):
payload = json.loads(request.body)
assert 'telephone' not in payload
return httmock.response(200, json.dumps({'code': 200, 'results': [{'id': 42}]}))
with httmock.HTTMock(handler):
response = app.post_json(
'/toulouse-foederis/foederis/create-application',
params={'phone': ''},
)
assert response.json['err'] == 0
assert response.json['data']['application_id'] == 42
def test_attach_file(self, resource, app):
@httmock.urlmatch(path=r'^.*/data/candidature/424242/fields/cv$')
def handler(url, request):
assert request.headers['content-type'].startswith('multipart/form-data')
assert request.headers['api-key'] == APIKEY
_, headers = cgi.parse_header(request.headers['content-type'])
headers['boundary'] = bytes(headers['boundary'], 'utf-8')
headers['CONTENT-LENGTH'] = request.headers['Content-Length']
payload = cgi.parse_multipart(BytesIO(request.body), headers)
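# cgi.parse_multipart returns every form field as a list of values, hence the list-wrapped expectations below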
assert payload == {
'contentType': ['application/pdf'],
'value': ['base 64 content'],
'fileName': ['cv.pdf'],
}
return httmock.response(200, json.dumps({'code': 200, 'results': ['Field updated']}))
@ -608,19 +625,25 @@ class TestEndpoints:
assert payload == {
'intitule_diplome': 'DUT anarchisme',
'R1258215': '424242',
'R79264997': '9000',
}
return httmock.response(200, json.dumps({'code': 200, 'results': [{'id': 'DEGREE_ID'}]}))
@httmock.urlmatch(path=r'^.*/data/diplome2/DEGREE_ID/fields/justificatif_diplome$')
def degree_file_handler(url, request):
assert request.headers['content-type'].startswith('multipart/form-data')
assert request.headers['api-key'] == APIKEY
_, headers = cgi.parse_header(request.headers['content-type'])
headers['boundary'] = bytes(headers['boundary'], 'utf-8')
headers['CONTENT-LENGTH'] = request.headers['Content-Length']
payload = cgi.parse_multipart(BytesIO(request.body), headers)
assert payload == {
'contentType': ['application/pdf'],
'value': ['base 64 content'],
'fileName': ['cv.pdf'],
}
return httmock.response(200, json.dumps({'code': 200, 'results': [{'id': 'DEGREE_ID'}]}))
@ -635,6 +658,7 @@ class TestEndpoints:
params={
'application_id': '424242',
'name': 'DUT anarchisme',
'degree_level': '9000',
'file': {
'content_type': 'application/pdf',
'content': 'base 64 content',
@ -643,7 +667,20 @@ class TestEndpoints:
},
)
assert response.json['err'] == 0
with httmock.HTTMock(create_degree_handler, degree_file_handler, error_handler):
response = app.post_json(
'/toulouse-foederis/foederis/attach-degree',
params={
'application_id': '424242',
'name': 'DUT anarchisme',
'degree_level': '9000',
'file': None,
},
)
assert response.json['err'] == 0
def test_migration_0003_no_null_no_charfield(migration):

View File

@ -80,7 +80,7 @@ def get_endpoint(name):
@pytest.fixture
def requests_mock():
return responses.RequestsMock(assert_all_requests_are_fired=False)
@pytest.fixture()
@ -132,11 +132,12 @@ def site_service(requests_mock):
@pytest.fixture()
def ape_service(requests_mock):
with ResponsesSoap(
wsdl_url='https://example.org/ApeService?wsdl',
wsdl_content=get_wsdl_file('ApeService.wsdl'),
settings=Settings(strict=False, xsd_ignore_sequence_order=True),
requests_mock=requests_mock,
)() as mock:
yield mock
@ -156,6 +157,20 @@ def wcs_service(settings, requests_mock):
yield mock
@pytest.fixture
def authentic_service(settings, requests_mock):
service = {
'idp': {
'url': 'http://idp.example.org/',
'verif_orig': 'abc',
'secret': 'def',
},
}
settings.KNOWN_SERVICES = {'authentic': service}
with requests_mock as mock:
yield mock
@pytest.fixture(scope='module')
def django_db_setup(django_db_setup, django_db_blocker):
with django_db_blocker.unblock():
@ -246,6 +261,9 @@ def django_db_setup(django_db_setup, django_db_blocker):
activity_mock.add_soap_response(
'readServiceList', get_xml_file('R_read_service_list.xml')
)
activity_mock.add_soap_response(
'readActivityList', get_xml_file('R_read_activity_list.xml')
)
con.update_activity_referentials()
with ape_service() as ape_mock:
@ -343,6 +361,7 @@ def test_manager_update_referentials(admin_user, app, con):
assert job.method_name == 'update_referentials'
@mock.patch('passerelle.contrib.toulouse_maelis.models.ToulouseMaelis.soap_client_cache_timeout', 0)
@mock.patch('passerelle.utils.Request.get')
def test_call_with_wrong_wsdl_url(mocked_get, con):
mocked_get.side_effect = CONNECTION_ERROR
@ -356,6 +375,7 @@ def test_call_with_wrong_wsdl_url(mocked_get, con):
assert e.value.url == 'https://example.org/FamilyService?wsdl'
@mock.patch('passerelle.contrib.toulouse_maelis.models.ToulouseMaelis.soap_client_cache_timeout', 0)
@mock.patch('passerelle.utils.Request.get')
def test_call_with_wrong_wsdl_content(mocked_get, con):
mocked_get.return_value = TOMCAT_ERROR
@ -363,6 +383,7 @@ def test_call_with_wrong_wsdl_content(mocked_get, con):
con.call('Family', 'isWSRunning')
@mock.patch('passerelle.contrib.toulouse_maelis.models.ToulouseMaelis.soap_client_cache_timeout', 0)
@mock.patch('passerelle.utils.Request.get')
def test_call_with_wrong_wsdl_content_bis(mocked_get, con, app):
mocked_get.return_value = TOMCAT_ERROR
@ -373,6 +394,7 @@ def test_call_with_wrong_wsdl_content_bis(mocked_get, con, app):
assert "'NoneType' object has no attribute 'getroottree'" in resp.json['err_desc']
@mock.patch('passerelle.contrib.toulouse_maelis.models.ToulouseMaelis.soap_client_cache_timeout', 0)
@mock.patch('passerelle.utils.Request.get')
@mock.patch('passerelle.utils.Request.post')
def test_call_with_wrong_soap_content(mocked_post, mocked_get, con):
@ -551,6 +573,7 @@ def test_update_referential_empty(mocked_get, con):
def test_cron(db):
assert Referential.objects.filter(referential_name='Category').count() == 3
assert sorted(list({x.referential_name for x in Referential.objects.all()})) == [
'Activity',
'ActivityNatureType',
'ApeIndicator',
'CSP',
@ -626,7 +649,32 @@ def test_link(family_service, con, app):
resp = app.post_json(url + '?NameID=local', params=params)
assert Link.objects.count() == 1
assert resp.json['err'] == 1
assert resp.json['err_desc'] == "RL1 does not match '1312' family"
assert resp.json['err_desc'] == "Data provided does not match any RL on '1312' family"
def test_link_rl2(family_service, con, app):
def request_check(request):
assert request.dossierNumber == 1312
family_service.add_soap_response(
'readFamily', get_xml_file('R_read_family.xml'), request_check=request_check
)
url = get_endpoint('link')
assert Link.objects.count() == 0
# skip caching invoice
con.referential.filter(referential_name='Regie').delete()
params = {
'family_id': '1312',
'firstname': 'Jane',
'lastname': 'Doe',
'dateBirth': '1940-06-22',
}
resp = app.post_json(url + '?NameID=local', params=params)
assert Link.objects.count() == 1
assert resp.json['err'] == 0
assert resp.json['data'] == 'ok'
@mock.patch('passerelle.utils.Request.get')
@ -672,7 +720,7 @@ def test_link_additional_properties_error(con, app):
assert resp.json['err_desc'] == "Additional properties are not allowed ('plop' was unexpected)"
def test_link_family_with_no_birth_error(family_service, con, app, caplog):
family_service.add_soap_response('readFamily', get_xml_file('R_read_family_no_rl1_birth.xml'))
url = get_endpoint('link')
@ -683,8 +731,30 @@ def test_link_family_with_no_birth_error(family_service, con, app):
'dateBirth': '1938-07-26',
}
resp = app.post_json(url + '?NameID=local', params=params)
assert len(caplog.records) == 4
assert caplog.records[2].levelno == logging.WARNING
assert caplog.records[2].message == "Maelis provides an invalid dateBirth for RL1 on '1312' family"
assert resp.json['err'] == 1
assert resp.json['err_desc'] == "Maelis provides an invalid dateBirth for RL1 on '1312' family"
assert resp.json['err_desc'] == "Data provided does not match any RL on '1312' family"
def test_link_rl2_error(family_service, con, app):
def request_check(request):
assert request.dossierNumber == 1312
family_service.add_soap_response(
'readFamily', get_xml_file('R_read_family_with_only_rl1.xml'), request_check=request_check
)
url = get_endpoint('link')
params = {
'family_id': '1312',
'firstname': 'Jane',
'lastname': 'Doe',
'dateBirth': '1940-06-22',
}
resp = app.post_json(url + '?NameID=local', params=params)
assert resp.json['err'] == 1
assert resp.json['err_desc'] == "Data provided does not match any RL on '1312' family"
def test_unlink(con, app):
@ -701,6 +771,130 @@ def test_unlink(con, app):
assert resp.json['err_desc'] == 'User not linked to family'
def test_get_link_list(con, app, authentic_service, freezer):
authentic_service.add(
responses.GET,
'http://idp.example.org/api/users/83f6e19feb2043d2aafb041aea445b2c/',
json={
'uuid': '83f6e19feb2043d2aafb041aea445b2c',
'username': 'jdoe',
'first_name': 'Jhon',
'last_name': 'Doe',
'email': 'jdoe@example.org',
'date_joined': '2020-04-06T19:00:00.000000+02:00',
'last_login': '2023-07-10T11:00:00.000000+02:00',
'password': 'XXX',
},
status=200,
)
authentic_service.add(
responses.GET,
'http://idp.example.org/api/users/local/',
json={'result': 0, 'errors': {'detail': 'Pas trouvé.'}},
status=404,
)
authentic_service.add(
responses.GET,
'http://idp.example.org/api/users/456/',
body=CONNECTION_ERROR,
)
url = get_endpoint('get-link-list')
# link 3 times to the 1312 family
freezer.move_to('2023-07-10 15:00:00')
Link.objects.create(resource=con, family_id='1312', name_id='83f6e19feb2043d2aafb041aea445b2c')
freezer.move_to('2023-07-10 16:00:00')
Link.objects.create(resource=con, family_id='1312', name_id='local')
Link.objects.create(resource=con, family_id='1312', name_id='456')
assert Link.objects.count() == 3
resp = app.get(url + '?family_id=1312')
assert len(authentic_service.calls) == 3
assert resp.json['err'] == 0
assert resp.json['data'] == [
{
'id': '83f6e19feb2043d2aafb041aea445b2c',
'context': {
'link': {
'name_id': '83f6e19feb2043d2aafb041aea445b2c',
'family_id': '1312',
'created': '2023-07-10T15:00:00Z',
'updated': '2023-07-10T15:00:00Z',
},
'user': {
'uuid': '83f6e19feb2043d2aafb041aea445b2c',
'username': 'jdoe',
'first_name': 'Jhon',
'last_name': 'Doe',
'email': 'jdoe@example.org',
'date_joined': '2020-04-06T19:00:00+02:00',
'last_login': '2023-07-10T11:00:00+02:00',
},
},
'text': 'Jhon Doe <jdoe@example.org> (lié le 10/07/2023 ; compte créé le 06/04/2020, dernière connexion le 10/07/2023)',
},
{
'id': 'local',
'context': {
'link': {
'name_id': 'local',
'family_id': '1312',
'created': '2023-07-10T16:00:00Z',
'updated': '2023-07-10T16:00:00Z',
}
},
'text': 'local (lié le 10/07/2023)',
},
{
'id': '456',
'context': {
'link': {
'name_id': '456',
'family_id': '1312',
'created': '2023-07-10T16:00:00Z',
'updated': '2023-07-10T16:00:00Z',
}
},
'text': '456 (lié le 10/07/2023)',
},
]
resp = app.get(url + '?family_id=plop')
assert resp.json['err'] == 0
assert resp.json['data'] == []
resp = app.get(url)
assert resp.json['err'] == 1
assert resp.json['err_desc'] == 'User not linked to family'
resp = app.get(url + '?NameID=local')
assert resp.json['err'] == 0
assert [x['text'][:50] for x in resp.json['data']] == [
'Jhon Doe <jdoe@example.org> (lié le 10/07/2023 ; c',
'local (lié le 10/07/2023)',
'456 (lié le 10/07/2023)',
]
def test_get_link_list_service_error(con, app, freezer):
url = get_endpoint('get-link-list')
freezer.move_to('2023-07-10 15:00:00')
Link.objects.create(resource=con, family_id='1312', name_id='83f6e19feb2043d2aafb041aea445b2c')
freezer.move_to('2023-07-10 16:00:00')
Link.objects.create(resource=con, family_id='1312', name_id='local')
Link.objects.create(resource=con, family_id='1312', name_id='456')
assert Link.objects.count() == 3
resp = app.get(url + '?family_id=1312')
assert resp.json['err'] == 0
assert [x['text'][:50] for x in resp.json['data']] == [
'83f6e19feb2043d2aafb041aea445b2c (lié le 10/07/202',
'local (lié le 10/07/2023)',
'456 (lié le 10/07/2023)',
]
def test_get_referential(con):
assert con.get_referential('Category') == [
{'code': 'BI', 'id': 'BI', 'libelle': 'BIPARENTALE', 'text': 'BIPARENTALE'},
@ -4922,32 +5116,52 @@ def test_get_rl1_direct_debit_order_soap_error(family_service, invoice_service,
)
def test_read_school_year_list(con, app, freezer):
url = get_endpoint('read-school-years-list')
freezer.move_to('2023-11-09')
resp = app.get(url + '?subscribable=0')
assert resp.json['err'] == 0
assert len(resp.json['data']) == 3
assert resp.json['data'] == [
{
'id': 2022,
'text': '2022',
'schoolYear': 2022,
'dateEndYearSchool': '2023-07-07T00:00:00+02:00',
'dateStartYearSchool': '2022-09-01T00:00:00+02:00',
'dateEndSubscribeSchool': '2023-09-01T00:00:00+02:00',
'dateStartSubscribeSchool': '2022-09-01T00:00:00+02:00',
},
{
'id': 2023,
'text': '2023',
'schoolYear': 2023,
'dateEndYearSchool': '2024-07-07T00:00:00+02:00',
'dateStartYearSchool': '2023-09-04T00:00:00+02:00',
'dateEndSubscribeSchool': '2024-07-01T00:00:00+02:00',
'dateStartSubscribeSchool': '2022-09-01T00:00:00+02:00',
},
{
'id': 2024,
'text': '2024',
'schoolYear': 2024,
'dateEndYearSchool': '2025-07-07T00:00:00+02:00',
'dateStartYearSchool': '2024-09-01T00:00:00+02:00',
'dateEndSubscribeSchool': None,
'dateStartSubscribeSchool': None,
},
]
# get only subscribable school years by default
resp = app.get(url)
assert resp.json['err'] == 0
assert [x['text'] for x in resp.json['data']] == ['2023']
resp = app.get(url + '?subscribable=plop')
assert resp.json['err'] == 1
assert resp.json['err_desc'] == "invalid truth value 'plop'"
def test_read_school_levels_list(con, app):
url = get_endpoint('read-school-levels-list')
@ -6211,6 +6425,47 @@ def test_create_child_school_pre_registration(family_service, con, app):
assert resp.json['data']['subscribeSchoolBean']['isWaitList']
def test_create_child_school_pre_registration_soap_error(family_service, con, app):
family_service.add_soap_response(
'preSubscribeSchoolPerim', get_xml_file('R_create_child_school_pre_registration_soap_error.xml')
)
url = get_endpoint('create-child-school-pre-registration')
resp = app.post_json(
url,
params={
'numPerson': '248460',
'schoolYear': '2023',
'levelCode': 'CM1',
'dateSubscribe': '2023-09-01T00:00:00+02:00',
},
)
assert resp.json['err'] == 1
assert resp.json['err_class'] == 'passerelle.utils.soap.SOAPFault'
assert (
resp.json['err_desc']
== 'SOAP service at https://example.org/FamilyService?wsdl returned an error "E25 : Cette personne nappartient pas à cette famille"'
)
def test_create_child_school_pre_registration_maelis_error(family_service, con, app):
family_service.add_soap_response(
'preSubscribeSchoolPerim', get_xml_file('R_create_child_school_pre_registration_maelis_error.xml')
)
url = get_endpoint('create-child-school-pre-registration')
resp = app.post_json(
url,
params={
'numPerson': '248460',
'schoolYear': '2023',
'levelCode': 'CM1',
'dateSubscribe': '2023-09-01T00:00:00+02:00',
},
)
assert resp.json['err'] == 1
assert resp.json['err_class'] == 'passerelle.utils.jsonresponse.APIError'
assert resp.json['err_desc'] == 'E113 : Il existe déjà une inscription scolaire pour cet enfant'
def test_create_child_school_pre_registration_with_exemption(family_service, con, app):
family_service.add_soap_response(
'presubscribeSchoolDerog', get_xml_file('R_create_child_school_pre_registration_with_exemption.xml')
@ -6300,25 +6555,17 @@ def test_get_public_criterias(start_dob, end_dob, expected):
assert expected == [x[1] for x in result]
def test_read_activity_list(con, app, freezer):
url = get_endpoint('read-activity-list')
con.loisir_nature_codes = '1,4,L,, S '
con.save()
freezer.move_to('2024-02-29')
resp = app.get(url)
assert resp.json['err'] == 0
assert len(resp.json['data']) == 8
activity_text = [x['activity']['text'] for x in resp.json['data']]
assert activity_text == sorted(activity_text)
assert [
(
x['id'],
@ -6327,16 +6574,16 @@ def test_read_activity_list(activity_service, con, app):
)
for x in resp.json['data']
] == [
('A10051141965-A10051141966-A10053179226', 'A10049329051', 'Sorties'),
('A10051141965-A10051141968-A10053179226', 'A10049329051', 'Sorties'),
('A10051141965-A10051141970-A10053179226', 'A10049329051', 'Sorties'),
('A10051141965-A10051141990-A10053179227', 'A10049329051', 'Sorties'),
('A10056514645-A10056514650-A10053179757', None, None),
('A10056514645-A10056514648-A10053179876', None, None),
('A10056514645-A10056514649-A10053179757', None, None),
('A10056517594-A10056517595-A10056517597', 'plop', None),
]
item = resp.json['data'][4]
item['activity'] = 'N/A'
item['unit'] = 'N/A'
item['place'] = 'N/A'
@ -6356,7 +6603,7 @@ def test_read_activity_list(activity_service, con, app):
},
'nature': {'text': "Nature de l'activité", 'data': {'4': 'ART PLASTIQUE'}, 'order': ['4']},
'type': {
'text': "Type de l'activité",
'text': 'Discipline',
'data': {'activite-reguliere': 'ACTIVITE REGULIERE'},
'order': ['activite-reguliere'],
},
@ -6381,7 +6628,6 @@ def test_read_activity_list(activity_service, con, app):
},
}
assert resp.json['meta'] == {
'ref_date': '2024-02-29',
'all_criterias': {
'service': {'text': 'Service', 'data': {'sorties': 'Sorties'}, 'order': ['sorties']},
'nature': {
@ -6390,7 +6636,7 @@ def test_read_activity_list(activity_service, con, app):
'order': ['1', '4'],
},
'type': {
'text': "Type de l'activité",
'text': 'Discipline',
'data': {
'activite-reguliere': 'ACTIVITE REGULIERE',
'activites-aquatiques-activite-reguliere': 'Activités Aquatiques Activité Réguliére',
@ -6440,17 +6686,16 @@ def test_read_activity_list(activity_service, con, app):
con.loisir_nature_codes = 'X,L,S'
con.save()
resp = app.get(url)
assert resp.json['err'] == 0
assert len(resp.json['data']) == 0
assert resp.json == {
'data': [],
'meta': {
'ref_date': '2024-02-29',
'all_criterias': {
'service': {'text': 'Service', 'data': {}, 'order': []},
'nature': {'text': "Nature de l'activité", 'data': {}, 'order': []},
'type': {'text': "Type de l'activité", 'data': {}, 'order': []},
'type': {'text': 'Discipline', 'data': {}, 'order': []},
'public': {'text': 'Public', 'data': {}, 'order': []},
'day': {'text': 'Jours', 'data': {}, 'order': []},
'place': {'text': 'Lieu', 'data': {}, 'order': []},
@ -6461,22 +6706,20 @@ def test_read_activity_list(activity_service, con, app):
}
def test_read_activity_list_no_nature(activity_service, con, app, freezer):
def request_check(request):
assert request.schoolyear == 1970
assert request.dateStartCalend == datetime.datetime(2023, 3, 1, 0, 0)
assert request.dateEndCalend == datetime.datetime(2025, 2, 28, 0, 0)
activity_service.add_soap_response(
'readActivityList', get_xml_file('R_read_activity_list_no_nature.xml'), request_check=request_check
)
url = get_endpoint('read-activity-list')
freezer.move_to('2024-02-29')
con.update_catalog_referential()
resp = app.get(url)
assert resp.json['err'] == 0
assert len(resp.json['data']) == 0
@ -6500,7 +6743,6 @@ def test_get_person_activity_list(activity_service, con, app):
'nature': '',
'start_date': '2022-09-01',
'end_date': '2023-08-31',
'text_template': '',
}
resp = app.get(url, params=params)
assert resp.json['err'] == 0
@ -6582,7 +6824,7 @@ def test_get_person_activity_list(activity_service, con, app):
('A10053187065', 'Semaine 2'),
]
del params['text_template']
params['type_ids'] = 'LOI_VAC,,'
resp = app.get(url, params=params)
assert resp.json['err'] == 0
@ -6617,7 +6859,6 @@ def test_get_person_activity_list_not_linked_error(con, app):
'nature': '',
'start_date': '2022-09-01',
'end_date': '2023-08-31',
'text_template': '',
}
resp = app.get(url, params=params)
assert resp.json['err'] == 1
@ -6634,7 +6875,6 @@ def test_get_person_activity_list_date_error(con, app):
'nature': '',
'start_date': 'bad',
'end_date': '2023-08-31',
'text_template': '',
}
resp = app.get(url, params=params, status=400)
assert resp.json['err'] == 1
@ -6677,7 +6917,6 @@ def test_get_person_unit_list(activity_service, con, app):
'activity_id': 'A10053187087',
'start_date': '2022-09-01',
'end_date': '2023-08-31',
'text_template': '',
}
resp = app.get(url, params=params)
assert resp.json['err'] == 0
@ -6745,7 +6984,6 @@ def test_get_person_unit_list_not_linked_error(con, app):
'activity_id': 'A10053187087',
'start_date': '2022-09-01',
'end_date': '2023-08-31',
'text_template': '',
}
resp = app.get(url, params=params)
assert resp.json['err'] == 1
@ -6762,7 +7000,6 @@ def test_get_person_unit_list_date_error(con, app):
'activity_id': 'A10053187087',
'start_date': 'bad',
'end_date': '2023-08-31',
'text_template': '',
}
resp = app.get(url, params=params, status=400)
assert resp.json['err'] == 1
@ -6801,7 +7038,6 @@ def test_get_person_unit_list_no_activity_error(activity_service, con, app):
'activity_id': 'plop',
'start_date': '2022-09-01',
'end_date': '2023-08-31',
'text_template': '',
}
resp = app.get(url, params=params)
assert resp.json['err'] == 1
@ -6827,7 +7063,6 @@ def test_get_person_place_list(activity_service, con, app):
'unit_id': 'A10053187241',
'start_date': '2022-09-01',
'end_date': '2023-08-31',
'text_template': '',
}
resp = app.get(url, params=params)
assert resp.json['err'] == 0
@ -6884,7 +7119,6 @@ def test_get_person_place_list_not_linked_error(con, app):
'unit_id': 'A10053187241',
'start_date': '2022-09-01',
'end_date': '2023-08-31',
'text_template': '',
}
resp = app.get(url, params=params)
assert resp.json['err'] == 1
@ -6902,7 +7136,6 @@ def test_get_person_place_list_date_error(con, app):
'unit_id': 'A10053187241',
'start_date': 'bad',
'end_date': '2023-08-31',
'text_template': '',
}
resp = app.get(url, params=params, status=400)
assert resp.json['err'] == 1
@ -6942,7 +7175,6 @@ def test_get_person_place_list_no_unit_error(activity_service, con, app):
'unit_id': 'plop',
'start_date': '2022-09-01',
'end_date': '2023-08-31',
'text_template': '',
}
resp = app.get(url, params=params)
assert resp.json['err'] == 1
@ -9972,7 +10204,6 @@ def test_cancel_basket_invoice_cron_having_for_payment_date(
request_check=request_check,
)
Link.objects.create(resource=con, family_id='1312', name_id='local')
assert con.cancel_invoice_delay == 30
assert con.max_payment_delay == 20
# invoice created on validate basket
@ -9986,32 +10217,46 @@ def test_cancel_basket_invoice_cron_having_for_payment_date(
assert invoice.basket_generation_date.strftime('%Y-%m-%d %H:%M:%S') == '2023-03-03 18:30:00'
assert invoice.maelis_cancel_notification_date is None
# invoice is payable
resp = app.get(get_endpoint('regie/109/invoices') + '?family_id=1312')
assert resp.json['err'] == 0
assert '1312-18' in [x['id'] for x in resp.json['data']]
assert resp.json['data'][0]['online_payment'] is True
# notify that payment starts
freezer.move_to('2023-03-03 18:35:00')
resp = app.get(get_endpoint('regie/109/invoice/1312-18') + '?payment')
assert resp.json['err'] == 0
invoice = con.invoice_set.get(regie_id=109, invoice_id=18)
assert invoice.start_payment_date.strftime('%Y-%m-%d %H:%M:%S') == '2023-03-03 18:35:00'
# invoice is still displayed before cancellation order is sent to maelis
# (but no more payable)
con.cancel_basket_invoices()
invoice = con.invoice_set.get(regie_id=109, invoice_id=18)
assert invoice.status() == 'for_payment'
assert invoice.maelis_cancel_notification_date is None
# invoice is no more payable
resp = app.get(get_endpoint('regie/109/invoices') + '?family_id=1312')
assert resp.json['err'] == 0
assert '1312-18' in [x['id'] for x in resp.json['data']]
assert resp.json['data'][0]['online_payment'] is False
# start payment date is not updated on further invoice call providing '?payment'
freezer.move_to('2023-03-03 18:40:00')
resp = app.get(get_endpoint('regie/109/invoice/1312-18') + '?payment')
assert resp.json['err'] == 0
invoice = con.invoice_set.get(regie_id=109, invoice_id=18)
assert invoice.start_payment_date.strftime('%Y-%m-%d %H:%M:%S') == '2023-03-03 18:35:00'
# cancellation order is now sent to maelis
freezer.move_to('2023-03-03 18:55:00')
con.cancel_basket_invoices()
assert caplog.records[-1].levelno == logging.INFO
assert caplog.records[-1].message == 'Annulation de <Invoice "109/18"> sur la famille \'1312\''
invoice = con.invoice_set.get(regie_id=109, invoice_id=18)
assert invoice.status() == 'cancelled'
assert invoice.maelis_cancel_notification_date.strftime('%Y-%m-%d %H:%M:%S') == '2023-03-03 18:55:00'
def test_cancel_basket_invoice_cron_keep_paid_invoices(
@ -10225,9 +10470,18 @@ def test_read_nursery_list(con, app):
{'idUnit': 'M10053212403', 'libelle': 'CC AMIDONNIERS - Occasionnels', 'typeAcc': 'OCCASIONAL'},
],
'idActivity': 'M10000000001',
'idService': 'A10049327627',
'activityType': {'code': 'CRECHCO', 'libelle': 'Crèche collective'},
}
resp = app.get(url, params={'service_ids': 'A10049329048,A10049327627'})
assert len(resp.json['data']) == 3
assert all(x['idService'] in ['A10049329048', 'A10049327627'] for x in resp.json['data'])
resp = app.get(url, params={'service_ids': 'A10049329048'})
assert len(resp.json['data']) == 2
assert all(x['idService'] == 'A10049329048' for x in resp.json['data'])
def test_get_nursery_geojson(con, app):
url = get_endpoint('get-nursery-geojson')
@ -10238,70 +10492,78 @@ def test_get_nursery_geojson(con, app):
}
resp = app.get(url, params=params)
assert resp.json['err'] == 0
assert resp.json['type'] == 'FeatureCollection'
assert len(resp.json['features']) == 6
assert resp.json['features'][0] == {
'type': 'Feature',
'geometry': {'coordinates': [1.430282, 43.606099], 'type': 'Point'},
'properties': {
'id': 'M10000000001:M10053212402:M10053212401',
'obs1': 'Quartier 1.2',
'obs2': 'Secteur 1',
'text': 'CC AMIDONNIERS - Réguliers',
'place': {
'address': {
'num': 29,
'town': 'TOULOUSE',
'street1': 'ALL DE BRIENNE',
'street2': None,
'zipcode': '31000',
'idStreet': None,
},
'idPlace': 'M10053212401',
'libelle': 'CC AMIDONNIERS',
'latitude': 43.606099,
'libelle2': None,
'longitude': 1.430282,
},
'libelle': 'CC AMIDONNIERS',
'libelle2': None,
'manager1': {
'phone': '0561615590',
'poste': 'CCAS',
'lastname': 'THOMAS',
'firstname': 'GUYLAINE',
},
'manager2': None,
'unitList': [
{
'idUnit': 'M10053212402',
'libelle': 'CC AMIDONNIERS - Réguliers',
'typeAcc': 'REGULAR',
},
{
'idUnit': 'M10053212403',
'libelle': 'CC AMIDONNIERS - Occasionnels',
'typeAcc': 'OCCASIONAL',
},
],
'idActivity': 'M10000000001',
'idService': 'A10049327627',
'activityType': {'code': 'CRECHCO', 'libelle': 'Crèche collective'},
'activity_id': 'M10000000001',
'place_id': 'M10053212401',
'unit_id': 'M10053212402',
'unit': {
'idUnit': 'M10053212402',
'libelle': 'CC AMIDONNIERS - Réguliers',
'typeAcc': 'REGULAR',
},
},
}
params = {
'activity_type': 'CRECHFAM',
'code_psu': 'REGULAR',
'service_ids': 'A10049329048,A10049327627',
}
resp = app.get(url, params=params)
assert resp.json['err'] == 0
assert len(resp.json['features']) == 2
assert all(
x['properties']['idService'] in ['A10049329048', 'A10049327627'] for x in resp.json['features']
)
def test_create_nursery_demand(ape_service, con, app):
def request_check(request):
@ -10430,7 +10692,7 @@ def test_invoices(invoice_service, con, app, caplog, freezer):
invoice = con.invoice_set.get(regie_id=102, invoice_id=30)
assert invoice.created.strftime('%Y-%m-%d %H:%M:%S') == '2023-03-03 18:00:00'
assert invoice.updated.strftime('%Y-%m-%d %H:%M:%S') == '2023-03-03 18:10:00'
assert invoice.maelis_data_update_date.strftime('%Y-%m-%d %H:%M:%S') == '2023-03-03 18:00:00'
assert invoice.status() == 'created'
@ -10553,6 +10815,7 @@ def test_invoices_history(invoice_service, con, app):
resp = app.get(url + '?NameID=local')
assert resp.json['err'] == 0
assert resp.json['has_invoice_for_payment'] is True
for invoice in resp.json['data']:
assert invoice['display_id']
assert invoice['label']
@ -10615,6 +10878,78 @@ def test_invoice(invoice_service, con, app):
assert resp.json['data']['label'] == 'CLAE JANVIER 2023'
@mock.patch('passerelle.utils.Request.get')
@mock.patch('passerelle.utils.Request.post')
def test_invoice_online_payment_no_response(mocked_post, mocked_get, con, app, db):
mocked_get.side_effect = (INVOICE_SERVICE_WSDL,)
mocked_post.side_effect = [
FakedResponse(content=get_xml_file('R_read_invoices.xml'), status_code=200),
ReadTimeout('timeout'),
ReadTimeout('timeout'),
]
url = get_endpoint('regie/102/invoice/1312-30')
# Assert we get the invoice in cache
resp = app.get(url + '?NameID=ignored')
assert resp.json['err'] == 0
assert resp.json['data']['online_payment'] is True
assert resp.json['data']['no_online_payment_reason'] is None
# Maelis is no more available
resp = app.get(url + '?NameID=ignored')
assert resp.json['err'] == 0
assert resp.json['data']['online_payment'] is False
assert resp.json['data']['no_online_payment_reason'] == 'Le service est temporairement indisponible.'
# No change on invoice already paid
url = get_endpoint('regie/102/invoice/1312-8')
resp = app.get(url + '?NameID=ignored')
assert resp.json['err'] == 0
assert resp.json['data']['online_payment'] is False
assert resp.json['data']['no_online_payment_reason'] is None
def test_invoices_online_payment_no_invoice(invoice_service, con, app, caplog, freezer):
invoice_service.add_soap_response('readInvoices', get_xml_file('R_read_invoices.xml'))
invoice_service.add_soap_response('readInvoices', get_xml_file('R_read_invoices_canceled.xml'))
invoice_service.add_soap_response('readInvoices', get_xml_file('R_read_invoices.xml'))
url = get_endpoint('regie/102/invoices') + '?NameID=local'
Link.objects.create(resource=con, family_id='1312', name_id='local')
# Assert we get the invoice in cache
freezer.move_to('2023-03-03 18:00:00')
resp = app.get(url)
assert resp.json['err'] == 0
assert len(resp.json['data']) == 1
assert resp.json['data'][0]['online_payment'] is True
assert resp.json['data'][0]['no_online_payment_reason'] is None
invoice = con.invoice_set.get(regie_id=102, invoice_id=30)
assert invoice.maelis_no_more_returned_date is None
assert invoice.status() == 'created'
# Maelis does not send the "canceled by agents" invoice
freezer.move_to('2023-03-03 18:10:00')
resp = app.get(url)
assert resp.json['err'] == 0
assert len(resp.json['data']) == 0 # invoice is no more available
invoice = con.invoice_set.get(regie_id=102, invoice_id=30)
assert invoice.maelis_no_more_returned_date.isoformat() == '2023-03-03T18:10:00+00:00'
assert invoice.status() == 'cancelled_by_agent'
# Maelis re-sends the invoice
freezer.move_to('2023-03-03 18:20:00')
resp = app.get(url)
assert resp.json['err'] == 0
assert len(resp.json['data']) == 1
assert resp.json['data'][0]['online_payment'] is True
assert resp.json['data'][0]['no_online_payment_reason'] is None
invoice = con.invoice_set.get(regie_id=102, invoice_id=30)
assert invoice.maelis_no_more_returned_date is None
assert invoice.status() == 'created'
def test_invoice_if_cancelled(activity_service, invoice_service, con, app, freezer):
activity_service.add_soap_response('getFamilyBasket', get_xml_file('R_get_family_basket.xml'))
activity_service.add_soap_response('validateBasket', get_xml_file('R_validate_basket.xml'))
@ -10665,7 +11000,7 @@ def test_invoice_for_payment(activity_service, invoice_service, con, app, freeze
)
invoice_service.add_soap_response('readInvoices', get_xml_file('R_read_invoices_regie_109.xml'))
url = get_endpoint('regie/109/invoice/1312-18')
assert con.cancel_invoice_delay == 30
assert con.max_payment_delay == 20
# invoice created on validate basket
freezer.move_to('2023-03-03 18:30:00')
@ -10674,15 +11009,22 @@ def test_invoice_for_payment(activity_service, invoice_service, con, app, freeze
)
assert resp.json['err'] == 0
# invoice is payable
resp = app.get(get_endpoint('regie/109/invoices') + '?family_id=1312')
assert resp.json['err'] == 0
assert '1312-18' in [x['id'] for x in resp.json['data']]
assert resp.json['data'][0]['online_payment'] is True
# notify that payment starts
freezer.move_to('2023-03-03 18:35:00')
resp = app.get(url + '?NameID=ignored&payment')
assert resp.json['err'] == 0
assert resp.json['data']['display_id'] == '18'
assert resp.json['data']['label'] == 'DSBL TEST'
# basket invoice is still returned but is no more payable
freezer.move_to('2023-03-03 18:50:00')
invoice = con.invoice_set.get(regie_id=109, invoice_id=18)
assert invoice.start_payment_date is not None
assert invoice.start_payment_date.strftime('%Y-%m-%d %H:%M:%S') == '2023-03-03 18:35:00'
assert invoice.status() == 'for_payment'
resp = app.get(url + '?NameID=local')
assert resp.json['err'] == 0
@ -10691,7 +11033,21 @@ def test_invoice_for_payment(activity_service, invoice_service, con, app, freeze
assert resp.json['data']['online_payment'] is False
assert resp.json['data']['no_online_payment_reason'] == 'Transation de payement en cours'
# invoice is no more payable
resp = app.get(get_endpoint('regie/109/invoices') + '?family_id=1312')
assert resp.json['err'] == 0
assert '1312-18' in [x['id'] for x in resp.json['data']]
assert resp.json['data'][0]['online_payment'] is False
# start payment date is not updated on further invoice call providing '?payment'
freezer.move_to('2023-03-03 18:40:00')
resp = app.get(get_endpoint('regie/109/invoice/1312-18') + '?payment')
assert resp.json['err'] == 0
invoice = con.invoice_set.get(regie_id=109, invoice_id=18)
assert invoice.start_payment_date.strftime('%Y-%m-%d %H:%M:%S') == '2023-03-03 18:35:00'
# basket invoice is no more returned since cancellation order sent to maelis
freezer.move_to('2023-03-03 18:55:00')
con.cancel_basket_invoices()
invoice = con.invoice_set.get(regie_id=109, invoice_id=18)
assert invoice.status() == 'cancelled'

View File

@ -493,6 +493,20 @@ def test_create_intervention_no_block(app, smart):
assert resp.json['err_desc'] == "'field2' field is required on 'coin' block"
@mock_response(
['/v1/type-intervention', None, INTERVENTION_TYPES],
)
def test_create_intervention_string_payload(app, smart):
payload = deepcopy(CREATE_INTERVENTION_PAYLOAD)
payload['fields']['coin_raw'] = 'plop'
resp = app.post_json(URL + 'create-intervention/', params=payload, status=400)
assert resp.json['err']
assert (
resp.json['err_desc']
== "cannot retrieve 'coin' block content from post data: got a <class 'str'> where a dict was expected"
)
@mock_response(
['/v1/type-intervention', None, INTERVENTION_TYPES],
)

138
tests/test_utils_defer.py Normal file
View File

@ -0,0 +1,138 @@
# Copyright (C) 2023 Entr'ouvert
#
# This program is free software: you can redistribute it and/or modify it
# under the terms of the GNU Affero General Public License as published
# by the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU Affero General Public License for more details.
#
# You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <http://www.gnu.org/licenses/>.
import threading
from django.core.management import BaseCommand, call_command
from django.db import transaction
from passerelle.utils.defer import (
run_later,
run_later_if_in_transaction,
run_later_middleware,
run_later_scope,
)
def test_run_later():
x = []
def f():
x.append(1)
run_later(f)
assert x == [1]
with run_later_scope():
run_later(f)
assert x == [1]
assert x == [1, 1]
def test_run_later_nested():
x = []
def f():
x.append(1)
run_later(f)
assert x == [1]
with run_later_scope():
with run_later_scope():
run_later(f)
assert x == [1]
# f is run when the outermost scope exits
assert x == [1, 1]
def test_run_threading():
x = []
def f(i):
x.append(i)
run_later(f, 1)
assert x == [1]
with run_later_scope():
run_later(f, 2)
assert x == [1]
thread = threading.Thread(target=run_later, args=(f, 3))
thread.start()
thread.join()
assert x == [1, 3]
assert x == [1, 3, 2]
def test_run_later_if_in_transaction(transactional_db):
x = []
def f():
x.append(1)
run_later_if_in_transaction(f)
assert x == [1]
with run_later_scope():
run_later_if_in_transaction(f)
assert x == [1, 1]
with transaction.atomic():
run_later_if_in_transaction(f)
assert x == [1, 1]
assert x == [1, 1]
assert x == [1, 1, 1]
def test_middleware(rf):
x = []
def view1(request):
def f():
x.append(1)
assert x == []
run_later(f)
assert x == [1]
request = rf.get('/')
view1(request)
assert x == [1]
x = []
def view2(request):
def f():
x.append(1)
assert x == []
run_later(f)
assert x == []
run_later_middleware(view2)(request)
assert x == [1]
def test_base_command():
x = []
def f():
x.append(1)
class MyCommand(BaseCommand):
def handle(self, *args, **kwargs):
assert x == []
run_later(f)
assert x == []
call_command(MyCommand())
assert x == [1]

17
tox.ini
View File

@ -1,6 +1,6 @@
[tox]
toxworkdir = {env:TMPDIR:/tmp}/tox-{env:USER}/passerelle/{env:RAND_TEST:}
envlist = py3-django32-codestyle-coverage,pylint,vitest
[testenv]
usedevelop = True
@ -72,7 +72,8 @@ deps =
WebTest
mock<4
httmock
pylint<3
astroid<3
pylint-django
django-webtest<1.9.3
pytest-freezegun
@ -81,5 +82,17 @@ deps =
mohawk
ldaptools
git+https://git.entrouvert.org/publik-django-templatetags.git
allowlist_externals =
./pylint.sh
commands =
./pylint.sh passerelle/ tests/
[testenv:vitest]
deps = nodeenv
allowlist_externals =
bash
npx
install_command = bash setup-vitest.sh {packages}
setenv =
NODE_PATH={envdir}/lib/node_modules
commands = npx vitest --run

15
vitest.config.js Normal file
View File

@ -0,0 +1,15 @@
import { fileURLToPath, URL } from 'node:url'
import { defineConfig } from 'vitest/config'
export default defineConfig({
test: {
include: ['tests/js/**/*.test.js'],
watchExclude: ['**'],
alias: {
qrcode: fileURLToPath(new URL('./passerelle/apps/qrcode/static/qrcode/js', import.meta.url)),
vitest: process.env.NODE_PATH + '/vitest'
},
environment: 'happy-dom'
}
})